Dec 04 09:39:14 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 04 09:39:14 crc restorecon[4742]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 09:39:14 crc restorecon[4742]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc 
restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc 
restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 
09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc 
restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:39:14 crc 
restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 
crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 
09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:39:14 crc 
restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc 
restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc 
restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 
crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc 
restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc 
restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc 
restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:14 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc 
restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc 
restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:39:15 crc restorecon[4742]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 09:39:15 crc restorecon[4742]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 04 09:39:15 crc kubenswrapper[4776]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 09:39:15 crc kubenswrapper[4776]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 04 09:39:15 crc kubenswrapper[4776]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 09:39:15 crc kubenswrapper[4776]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 04 09:39:15 crc kubenswrapper[4776]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 04 09:39:15 crc kubenswrapper[4776]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.263548 4776 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.265959 4776 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.265976 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.265981 4776 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.265984 4776 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.265990 4776 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.265995 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266000 4776 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266004 4776 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266009 4776 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266014 4776 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266019 4776 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266023 4776 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266028 4776 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266032 4776 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266036 4776 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266040 4776 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266044 4776 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266048 4776 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266052 4776 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266058 4776 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266072 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266076 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266080 4776 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266084 4776 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266089 4776 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266092 4776 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266096 4776 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266100 4776 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266103 4776 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266107 4776 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266110 4776 feature_gate.go:330] unrecognized feature gate: Example
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266115 4776 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266118 4776 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266122 4776 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266126 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266129 4776 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266133 4776 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266136 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266140 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266143 4776 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266148 4776 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266153 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266157 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266162 4776 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266167 4776 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266171 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266174 4776 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266178 4776 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266181 4776 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266187 4776 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266191 4776 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266195 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266199 4776 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266203 4776 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266206 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266210 4776 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266213 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266217 4776 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266220 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266224 4776 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266227 4776 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266230 4776 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266234 4776 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266237 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266241 4776 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266244 4776 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266247 4776 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266251 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266254 4776 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266258 4776 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.266262 4776 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266333 4776 flags.go:64] FLAG: --address="0.0.0.0"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266341 4776 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266348 4776 flags.go:64] FLAG: --anonymous-auth="true"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266355 4776 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266360 4776 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266364 4776 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266370 4776 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266375 4776 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266379 4776 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266383 4776 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266388 4776 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266392 4776 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266396 4776 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266400 4776 flags.go:64] FLAG: --cgroup-root=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266405 4776 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266409 4776 flags.go:64] FLAG: --client-ca-file=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266413 4776 flags.go:64] FLAG: --cloud-config=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266417 4776 flags.go:64] FLAG: --cloud-provider=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266421 4776 flags.go:64] FLAG: --cluster-dns="[]"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266427 4776 flags.go:64] FLAG: --cluster-domain=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266431 4776 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266435 4776 flags.go:64] FLAG: --config-dir=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266439 4776 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266444 4776 flags.go:64] FLAG: --container-log-max-files="5"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266451 4776 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266457 4776 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266462 4776 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266467 4776 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266472 4776 flags.go:64] FLAG: --contention-profiling="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266477 4776 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266481 4776 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266487 4776 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266492 4776 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266499 4776 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266504 4776 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266509 4776 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266514 4776 flags.go:64] FLAG: --enable-load-reader="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266519 4776 flags.go:64] FLAG: --enable-server="true"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266524 4776 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266532 4776 flags.go:64] FLAG: --event-burst="100"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266538 4776 flags.go:64] FLAG: --event-qps="50"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266543 4776 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266548 4776 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266552 4776 flags.go:64] FLAG: --eviction-hard=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266557 4776 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266562 4776 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266566 4776 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266571 4776 flags.go:64] FLAG: --eviction-soft=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266575 4776 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266580 4776 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266584 4776 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266588 4776 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266592 4776 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266596 4776 flags.go:64] FLAG: --fail-swap-on="true"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266600 4776 flags.go:64] FLAG: --feature-gates=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266605 4776 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266610 4776 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266614 4776 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266618 4776 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266622 4776 flags.go:64] FLAG: --healthz-port="10248"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266626 4776 flags.go:64] FLAG: --help="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266630 4776 flags.go:64] FLAG: --hostname-override=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266634 4776 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266638 4776 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266642 4776 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266646 4776 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266650 4776 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266654 4776 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266658 4776 flags.go:64] FLAG: --image-service-endpoint=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266662 4776 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266666 4776 flags.go:64] FLAG: --kube-api-burst="100"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266670 4776 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266674 4776 flags.go:64] FLAG: --kube-api-qps="50"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266678 4776 flags.go:64] FLAG: --kube-reserved=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266682 4776 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266686 4776 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266690 4776 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266694 4776 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266698 4776 flags.go:64] FLAG: --lock-file=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266702 4776 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266707 4776 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266710 4776 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266717 4776 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266720 4776 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266724 4776 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266728 4776 flags.go:64] FLAG: --logging-format="text"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266732 4776 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266737 4776 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266741 4776 flags.go:64] FLAG: --manifest-url=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266745 4776 flags.go:64] FLAG: --manifest-url-header=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266750 4776 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266755 4776 flags.go:64] FLAG: --max-open-files="1000000"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266761 4776 flags.go:64] FLAG: --max-pods="110"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266765 4776 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266770 4776 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266774 4776 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266778 4776 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266782 4776 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266786 4776 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266791 4776 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266800 4776 flags.go:64] FLAG: --node-status-max-images="50"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266809 4776 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266813 4776 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266817 4776 flags.go:64] FLAG: --pod-cidr=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266821 4776 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266827 4776 flags.go:64] FLAG: --pod-manifest-path=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266832 4776 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266836 4776 flags.go:64] FLAG: --pods-per-core="0"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266840 4776 flags.go:64] FLAG: --port="10250"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266844 4776 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266848 4776 flags.go:64] FLAG: --provider-id=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266852 4776 flags.go:64] FLAG: --qos-reserved=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266856 4776 flags.go:64] FLAG: --read-only-port="10255"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266860 4776 flags.go:64] FLAG: --register-node="true"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266864 4776 flags.go:64] FLAG: --register-schedulable="true"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266868 4776 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266875 4776 flags.go:64] FLAG: --registry-burst="10"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266879 4776 flags.go:64] FLAG: --registry-qps="5"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266883 4776 flags.go:64] FLAG: --reserved-cpus=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266887 4776 flags.go:64] FLAG: --reserved-memory=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266894 4776 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266898 4776 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266902 4776 flags.go:64] FLAG: --rotate-certificates="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266906 4776 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266910 4776 flags.go:64] FLAG: --runonce="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266930 4776 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266934 4776 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266939 4776 flags.go:64] FLAG: --seccomp-default="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266943 4776 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266947 4776 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266951 4776 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266956 4776 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266960 4776 flags.go:64] FLAG: --storage-driver-password="root"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266964 4776 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266969 4776 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266973 4776 flags.go:64] FLAG: --storage-driver-user="root"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266977 4776 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266981 4776 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266985 4776 flags.go:64] FLAG: --system-cgroups=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266989 4776 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.266996 4776 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.267000 4776 flags.go:64] FLAG: --tls-cert-file=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.267004 4776 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.267009 4776 flags.go:64] FLAG: --tls-min-version=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.267013 4776 flags.go:64] FLAG: --tls-private-key-file=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.267017 4776 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.267021 4776 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.267025 4776 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.267029 4776 flags.go:64] FLAG: --v="2"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.267035 4776 flags.go:64] FLAG: --version="false"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.267040 4776 flags.go:64] FLAG: --vmodule=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.267044 4776 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.267050 4776 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267165 4776 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267170 4776 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267174 4776 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267177 4776 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267181 4776 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267185 4776 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267188 4776 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267194 4776 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267198 4776 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267201 4776 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267205 4776 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267208 4776 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267212 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267217 4776 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267220 4776 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267224 4776 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267227 4776 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267231 4776 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267234 4776 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267237 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267241 4776 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267244 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267249 4776 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267253 4776 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267257 4776 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267262 4776 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267266 4776 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267270 4776 feature_gate.go:330] unrecognized feature gate: Example
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267273 4776 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267277 4776 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267280 4776 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267285 4776 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267289 4776 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267293 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267296 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267300 4776 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267303 4776 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267306 4776 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267310 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267313 4776 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267317 4776 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267320 4776 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267323 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267327 4776 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267331 4776 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267336 4776 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267339 4776 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267343 4776 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267346 4776 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267350 4776 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267353 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267356 4776 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267360 4776 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267363 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267367 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267370 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267373 4776 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267377 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267380 4776 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267385 4776 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267390 4776 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267393 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267397 4776 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267401 4776 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267405 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267408 4776 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267412 4776 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267415 4776 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267418 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267423 4776 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.267428 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.267439 4776 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.279657 4776 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.279701 4776 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.279848 4776 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.279862 4776 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.279872 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.279886 4776 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.279898 4776 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.279908 4776 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.279948 4776 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.279957 4776 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.279967 4776 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.279976 4776 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.279985 4776 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.279997 4776 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280008 4776 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280018 4776 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280027 4776 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280036 4776 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280045 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280053 4776 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280064 4776 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280073 4776 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280083 4776 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280092 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280102 4776 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280112 4776 feature_gate.go:330] unrecognized feature gate: Example
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280121 4776 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280137 4776 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280146 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280155 4776 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280164 4776 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280175 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280183 4776 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280195 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280204 4776 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280213 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280223 4776 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280232 4776 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280241 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280250 4776 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280259 4776 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280269 4776 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280278 4776 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280287 4776 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280297 4776 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280306 4776 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280316 4776 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280328 4776 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280339 4776 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280350 4776 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280360 4776 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280370 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280380 4776 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280393 4776 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280404 4776 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280416 4776 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280427 4776 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280437 4776 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280447 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280457 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280467 4776 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280478 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280488 4776 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280497 4776 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280506 4776 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280515 4776 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280525 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280534 4776 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280543 4776 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280552 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280560 4776 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280570 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280579 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.280595 4776 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280844 4776 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280858 4776 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280869 4776 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280880 4776 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280889 4776 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280898 4776 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280907 4776 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280940 4776 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280950 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280959 4776 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280968 4776 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280977 4776 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280986 4776 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.280998 4776 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281009 4776 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281019 4776 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281030 4776 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281040 4776 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281050 4776 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281060 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281069 4776 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281078 4776 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281088 4776 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281097 4776 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281107 4776 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281116 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281125 4776 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281134 4776 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281143 4776 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281152 4776 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281161 4776 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281169 4776 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281178 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281187 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281195 4776 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281205 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281214 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281226 4776 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281237 4776 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281246 4776 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281255 4776 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281265 4776 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281275 4776 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281284 4776 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281292 4776 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281302 4776 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281310 4776 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281320 4776 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281329 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281341 4776 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281352 4776 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281362 4776 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281372 4776 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281381 4776 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281390 4776 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281402 4776 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281416 4776 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281426 4776 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281437 4776 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281447 4776 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281458 4776 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281468 4776 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281479 4776 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281488 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281498 4776 feature_gate.go:330] unrecognized feature gate: Example
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281507 4776 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281517 4776 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281526 4776 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281535 4776 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281544 4776 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.281553 4776 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.281568 4776 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.281872 4776 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.286872 4776 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.287046 4776 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.287965 4776 server.go:997] "Starting client certificate rotation"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.288004 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.288840 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-29 11:27:41.447874721 +0000 UTC
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.289041 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.295828 4776 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.297396 4776 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.298386 4776 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.308057 4776 log.go:25] "Validated CRI v1 runtime API"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.325587 4776 log.go:25] "Validated CRI v1 image API"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.327686 4776 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.330441 4776 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-04-09-34-43-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.330489 4776 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.353338 4776 manager.go:217] Machine: {Timestamp:2025-12-04 09:39:15.351252928 +0000 UTC m=+0.217733355 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ae4f41f6-942c-4e00-b556-5f0151068ad6 BootID:19685a11-7601-4b6d-a386-4bf61b88c87c Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:5e:12:5e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:5e:12:5e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ad:98:c8 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:cd:9f:8c Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a7:47:54 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:10:21:58 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:e2:d0:92 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:6a:dd:d2:f7:0b:e8 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:22:72:65:b4:2f:19 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.353974 4776 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.354346 4776 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.354879 4776 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.355317 4776 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.355468 4776 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.356081 4776 topology_manager.go:138] "Creating topology manager with none policy"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.356199 4776 container_manager_linux.go:303] "Creating device plugin manager"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.356566 4776 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.356726 4776 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.357230 4776 state_mem.go:36] "Initialized new in-memory state store"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.357473 4776 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.358510 4776 kubelet.go:418] "Attempting to sync node with API server"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.358667 4776 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.358801 4776 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.358953 4776 kubelet.go:324] "Adding apiserver pod source"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.359083
4776 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.360695 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.360720 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.360976 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.361033 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.361227 4776 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.361641 4776 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.362455 4776 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.363005 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.363041 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.363051 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.363058 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.363069 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.363076 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.363083 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.363094 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.363104 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.363112 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.363125 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.363135 4776 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.363352 4776 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.364050 4776 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.364370 4776 server.go:1280] "Started kubelet" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.364794 4776 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.364780 4776 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.365732 4776 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.367049 4776 server.go:460] "Adding debug handlers to kubelet server" Dec 04 09:39:15 crc systemd[1]: Started Kubernetes Kubelet. 
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.368138 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.368191 4776 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.367702 4776 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.173:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187df9ae267b314c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 09:39:15.364335948 +0000 UTC m=+0.230816325,LastTimestamp:2025-12-04 09:39:15.364335948 +0000 UTC m=+0.230816325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.368497 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 21:30:47.094164801 +0000 UTC Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.368552 4776 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 179h51m31.72561886s for next certificate rotation Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.368775 4776 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.368794 4776 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.368976 4776 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 04 
09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.369414 4776 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.369912 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.370028 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.370157 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="200ms" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.370642 4776 factory.go:55] Registering systemd factory Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.370676 4776 factory.go:221] Registration of the systemd container factory successfully Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.371329 4776 factory.go:153] Registering CRI-O factory Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.371365 4776 factory.go:221] Registration of the crio container factory successfully Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.371480 4776 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no 
such file or directory Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.371526 4776 factory.go:103] Registering Raw factory Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.371560 4776 manager.go:1196] Started watching for new ooms in manager Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.376097 4776 manager.go:319] Starting recovery of all containers Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.392869 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.393233 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.393403 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.393607 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.393791 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 04 09:39:15 crc 
kubenswrapper[4776]: I1204 09:39:15.394011 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.394200 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.394414 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.394610 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.394804 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.395026 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.395208 4776 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.395423 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.395604 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.395820 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.396164 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.396366 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.396521 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.396692 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.396895 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.397149 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.397309 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.397528 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.397707 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.397872 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.398113 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.398319 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.398530 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.398765 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.398973 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" 
seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.399203 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.396747 4776 manager.go:324] Recovery completed Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.399406 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.399812 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.400028 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.400199 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.400382 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" 
seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.400540 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.400662 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.400779 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.400880 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.401026 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.401138 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.401241 4776 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.401348 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.401459 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.401570 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.401778 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.401894 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.402060 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.402171 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.402282 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.402397 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.402525 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.402636 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.402753 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.402863 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.403013 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.403134 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.403242 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.403352 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.403453 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.403553 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.403692 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.403803 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.403951 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.404070 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.404198 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.404312 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.404428 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.404536 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.406556 4776 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.406716 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.406850 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.406989 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.407107 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.407223 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.407328 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.407456 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.407562 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.407714 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.409911 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.410620 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.411083 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.411236 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.411340 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.411432 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.411538 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.411619 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.411762 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.411846 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.411986 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.412095 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.412177 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.412283 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.412366 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.412465 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.412544 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.412638 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.412718 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.412823 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.412903 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.413070 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.413194 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.413278 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.413361 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.413460 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.413589 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.413677 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.413758 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.413844 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.413951 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.414050 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.414130 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.414209 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.414301 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.414809 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.414905 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.415043 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.415141 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.415243 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.415355 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.415437 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.415536 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.415615 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.415746 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.415858 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.415993 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.416090 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.416196 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.416274 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.416399 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.416497 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.416575 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.416674 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.416757 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.416857 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.417012 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.417660 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.417728 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.417756 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.417774 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.417792 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.417807 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.417822 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.417105 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.417838 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.417999 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418019 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418036 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418051 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418064 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418081 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418099 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418113 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418129 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418144 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418163 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418181 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418199 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418220 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418241 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418263 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418281 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418299 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418316 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418332 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418352 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418368 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418386 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418405 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418422 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418437 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418455 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418472 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418487 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418504 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418521 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418541 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418559 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418593 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418615 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418633 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418653 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418672 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418694 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418713 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418731 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418748 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418766 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418785 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" 
seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418805 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418826 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418845 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418866 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418884 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418903 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 
09:39:15.418951 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418973 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.418994 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.419016 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.419035 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.419056 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.419077 4776 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.419096 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.419115 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.419133 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.419157 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.419176 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.419197 4776 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.419218 4776 reconstruct.go:97] "Volume reconstruction finished" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.419232 4776 reconciler.go:26] "Reconciler: start to sync state" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.421965 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.422018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.422032 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.423094 4776 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.423130 4776 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.423162 4776 state_mem.go:36] "Initialized new in-memory state store" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.434506 4776 policy_none.go:49] "None policy: Start" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.438036 4776 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.438338 4776 state_mem.go:35] "Initializing new in-memory state store" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.445187 4776 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.450608 4776 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.450751 4776 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.450853 4776 kubelet.go:2335] "Starting kubelet main sync loop" Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.451061 4776 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.452193 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.452260 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.469718 4776 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.509936 4776 manager.go:334] "Starting Device Plugin manager" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.510116 4776 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.510144 4776 server.go:79] "Starting device plugin registration server" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.510670 4776 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 04 09:39:15 crc 
kubenswrapper[4776]: I1204 09:39:15.510696 4776 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.511242 4776 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.511352 4776 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.511369 4776 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.524633 4776 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.551433 4776 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.551560 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.552657 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.552727 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.552752 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.552991 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.553392 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.553450 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.554040 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.554061 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.554073 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.554180 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.554213 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.554280 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.554305 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.554323 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.554354 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.554946 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.554975 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.554991 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.555357 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.555395 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.555414 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.555531 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.555630 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.555660 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.556374 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.556397 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.556406 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.556662 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.556705 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.556717 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.556800 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.556954 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.557006 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.557718 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.557790 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.557811 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.557899 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.557960 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.557976 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.558181 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.558256 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.559231 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.559265 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.559277 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.570986 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="400ms" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.612074 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.613414 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.613552 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.613645 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.613760 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.614467 4776 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.620994 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621051 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621087 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621121 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621151 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621194 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621254 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621309 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621341 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621373 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621405 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621448 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621482 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621513 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.621545 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.685479 4776 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.173:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187df9ae267b314c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 09:39:15.364335948 +0000 UTC m=+0.230816325,LastTimestamp:2025-12-04 09:39:15.364335948 +0000 UTC m=+0.230816325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.722805 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723121 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723277 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723392 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723464 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723335 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723215 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723489 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723654 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723688 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723715 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723752 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723155 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723794 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723778 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723839 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723882 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723896 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723958 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723843 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.723990 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.724006 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.724045 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.724111 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.724132 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.724184 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.724084 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.724211 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.724179 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.724007 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.814864 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 
09:39:15.816592 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.816660 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.816685 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.816725 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.817363 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.886894 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.896218 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.915143 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-7b667199186b8f9ee90955d247a0fc207af01efe86081f0f9cd1d26d8d22fa0c WatchSource:0}: Error finding container 7b667199186b8f9ee90955d247a0fc207af01efe86081f0f9cd1d26d8d22fa0c: Status 404 returned error can't find the container with id 7b667199186b8f9ee90955d247a0fc207af01efe86081f0f9cd1d26d8d22fa0c Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.918652 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.925601 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-060c6caa3714815076061821a23d01f24783ca86f4347bc98bd67904ee67d3e0 WatchSource:0}: Error finding container 060c6caa3714815076061821a23d01f24783ca86f4347bc98bd67904ee67d3e0: Status 404 returned error can't find the container with id 060c6caa3714815076061821a23d01f24783ca86f4347bc98bd67904ee67d3e0 Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.926378 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: I1204 09:39:15.933871 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.940864 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-f8decf96fcee92d65ddf88fd87b7de4ace8c60e8350413472adc48f06542c660 WatchSource:0}: Error finding container f8decf96fcee92d65ddf88fd87b7de4ace8c60e8350413472adc48f06542c660: Status 404 returned error can't find the container with id f8decf96fcee92d65ddf88fd87b7de4ace8c60e8350413472adc48f06542c660 Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.942819 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-2614c1249e16e716538096fc2fab9f52941883102cf76fed4c98812a9e4bf6ad WatchSource:0}: Error finding container 2614c1249e16e716538096fc2fab9f52941883102cf76fed4c98812a9e4bf6ad: Status 404 returned error can't find 
the container with id 2614c1249e16e716538096fc2fab9f52941883102cf76fed4c98812a9e4bf6ad Dec 04 09:39:15 crc kubenswrapper[4776]: W1204 09:39:15.965560 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-41b7a3414a068324ad4119ec46ffbb1161cbbc46afe64c1e30a4de97dd78da24 WatchSource:0}: Error finding container 41b7a3414a068324ad4119ec46ffbb1161cbbc46afe64c1e30a4de97dd78da24: Status 404 returned error can't find the container with id 41b7a3414a068324ad4119ec46ffbb1161cbbc46afe64c1e30a4de97dd78da24 Dec 04 09:39:15 crc kubenswrapper[4776]: E1204 09:39:15.971765 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="800ms" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.217483 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.219105 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.219140 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.219152 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.219176 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 09:39:16 crc kubenswrapper[4776]: E1204 09:39:16.219796 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: 
connect: connection refused" node="crc" Dec 04 09:39:16 crc kubenswrapper[4776]: W1204 09:39:16.357377 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Dec 04 09:39:16 crc kubenswrapper[4776]: E1204 09:39:16.357468 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.365315 4776 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.459151 4776 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="39644a64389b34550a9f73e556fc282668d6f1756a9d379846858b60e51c366b" exitCode=0 Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.459249 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"39644a64389b34550a9f73e556fc282668d6f1756a9d379846858b60e51c366b"} Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.459389 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7b667199186b8f9ee90955d247a0fc207af01efe86081f0f9cd1d26d8d22fa0c"} Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.459500 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.460634 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.460695 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.460720 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.461162 4776 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c" exitCode=0 Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.461263 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c"} Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.461302 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"41b7a3414a068324ad4119ec46ffbb1161cbbc46afe64c1e30a4de97dd78da24"} Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.461415 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.462711 4776 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.462735 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.462743 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.464341 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317"} Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.464371 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f8decf96fcee92d65ddf88fd87b7de4ace8c60e8350413472adc48f06542c660"} Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.466468 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0" exitCode=0 Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.466540 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0"} Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.466564 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2614c1249e16e716538096fc2fab9f52941883102cf76fed4c98812a9e4bf6ad"} Dec 04 09:39:16 crc 
kubenswrapper[4776]: I1204 09:39:16.466685 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.467736 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.467792 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.467815 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.469076 4776 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="725f833b3201959184fa01b309746c6962b9a36f54c27a5df5723237dbd8e0f4" exitCode=0 Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.469098 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"725f833b3201959184fa01b309746c6962b9a36f54c27a5df5723237dbd8e0f4"} Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.469112 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"060c6caa3714815076061821a23d01f24783ca86f4347bc98bd67904ee67d3e0"} Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.469202 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.469970 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.470073 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.470096 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.473188 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.474388 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.474432 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:16 crc kubenswrapper[4776]: I1204 09:39:16.474448 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:16 crc kubenswrapper[4776]: E1204 09:39:16.772730 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="1.6s" Dec 04 09:39:16 crc kubenswrapper[4776]: W1204 09:39:16.780194 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Dec 04 09:39:16 crc kubenswrapper[4776]: E1204 09:39:16.780261 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:39:16 crc kubenswrapper[4776]: W1204 09:39:16.939393 4776 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Dec 04 09:39:16 crc kubenswrapper[4776]: E1204 09:39:16.939477 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:39:16 crc kubenswrapper[4776]: W1204 09:39:16.963146 4776 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Dec 04 09:39:16 crc kubenswrapper[4776]: E1204 09:39:16.963224 4776 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.020243 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.022124 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.022173 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.022187 4776 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.022215 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.389938 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.472477 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f59a5cf437a7a791ddba733c843d7476e583c40b7163aaf67b3b97642d4b9172"} Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.472544 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.473263 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.473292 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.473305 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.474657 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91"} Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.474680 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"124529cb7ef6f25b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be"} Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.474690 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a"} Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.474694 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.475496 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.475537 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.475548 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.476491 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d"} Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.476518 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc"} Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.476528 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4"} Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.476563 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.480566 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.480600 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.480613 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.482587 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b"} Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.482621 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774"} Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.482642 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb"} Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.482657 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd"} Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.483850 4776 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2137726e0adaef3ccc2fe8f479e6348dc5074a83c83e925df0b228932c55a6b1" exitCode=0 Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.483893 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2137726e0adaef3ccc2fe8f479e6348dc5074a83c83e925df0b228932c55a6b1"} Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.484054 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.484725 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.484752 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:17 crc kubenswrapper[4776]: I1204 09:39:17.484764 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.454238 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.490397 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4"} Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 
09:39:18.490467 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.491464 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.491495 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.491508 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.492512 4776 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c45b51ee7e4278d0de6a1d8068b1db01bd5132455f33a7db698559c5ecb56ff3" exitCode=0 Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.492617 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c45b51ee7e4278d0de6a1d8068b1db01bd5132455f33a7db698559c5ecb56ff3"} Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.492670 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.492688 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.492736 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.493330 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.493572 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:18 crc 
kubenswrapper[4776]: I1204 09:39:18.493973 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.494006 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.494018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.494384 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.494409 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.494420 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.494628 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.494776 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.494906 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.497267 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.497541 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.497848 4776 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.569268 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:18 crc kubenswrapper[4776]: I1204 09:39:18.574386 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:19 crc kubenswrapper[4776]: I1204 09:39:19.498293 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"39d941b5ac6cac8f936dda95900923d9d8db40ce85886b0078d4243b3f519f46"} Dec 04 09:39:19 crc kubenswrapper[4776]: I1204 09:39:19.498350 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e9c469b2ce66f6216401f0b7c7580885e45f733c0c483a3a07055f049a0dce78"} Dec 04 09:39:19 crc kubenswrapper[4776]: I1204 09:39:19.498366 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 09:39:19 crc kubenswrapper[4776]: I1204 09:39:19.498404 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:19 crc kubenswrapper[4776]: I1204 09:39:19.498429 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:19 crc kubenswrapper[4776]: I1204 09:39:19.498366 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"18f62c5b9c990faa3f9b4ab4f9dccc2a662214b9470e862c5a2afefb71fa14c7"} Dec 04 09:39:19 crc kubenswrapper[4776]: I1204 09:39:19.498482 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c703f5e0aa21310b7d07c40c18efc6d01e991a76334706230f44b6aacc399e64"} Dec 04 09:39:19 crc kubenswrapper[4776]: I1204 09:39:19.501664 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:19 crc kubenswrapper[4776]: I1204 09:39:19.501697 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:19 crc kubenswrapper[4776]: I1204 09:39:19.501712 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:19 crc kubenswrapper[4776]: I1204 09:39:19.501840 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:19 crc kubenswrapper[4776]: I1204 09:39:19.501851 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:19 crc kubenswrapper[4776]: I1204 09:39:19.501858 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:20 crc kubenswrapper[4776]: I1204 09:39:20.510836 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2df0701a260e80ea33d6ad4a787afb26bb9fe525c1073b187e759da8759431c3"} Dec 04 09:39:20 crc kubenswrapper[4776]: I1204 09:39:20.510871 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 09:39:20 crc kubenswrapper[4776]: I1204 09:39:20.511013 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:20 crc kubenswrapper[4776]: I1204 09:39:20.511020 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:20 crc kubenswrapper[4776]: I1204 
09:39:20.512490 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:20 crc kubenswrapper[4776]: I1204 09:39:20.512546 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:20 crc kubenswrapper[4776]: I1204 09:39:20.512570 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:20 crc kubenswrapper[4776]: I1204 09:39:20.512678 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:20 crc kubenswrapper[4776]: I1204 09:39:20.512737 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:20 crc kubenswrapper[4776]: I1204 09:39:20.512761 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:21 crc kubenswrapper[4776]: I1204 09:39:21.454761 4776 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:39:21 crc kubenswrapper[4776]: I1204 09:39:21.454878 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 09:39:21 crc kubenswrapper[4776]: I1204 09:39:21.513128 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:21 crc 
kubenswrapper[4776]: I1204 09:39:21.514324 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:21 crc kubenswrapper[4776]: I1204 09:39:21.514408 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:21 crc kubenswrapper[4776]: I1204 09:39:21.514422 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:21 crc kubenswrapper[4776]: I1204 09:39:21.735148 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:21 crc kubenswrapper[4776]: I1204 09:39:21.735319 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 09:39:21 crc kubenswrapper[4776]: I1204 09:39:21.735354 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:21 crc kubenswrapper[4776]: I1204 09:39:21.736500 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:21 crc kubenswrapper[4776]: I1204 09:39:21.736581 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:21 crc kubenswrapper[4776]: I1204 09:39:21.736600 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:22 crc kubenswrapper[4776]: I1204 09:39:22.369234 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:39:22 crc kubenswrapper[4776]: I1204 09:39:22.369706 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:22 crc kubenswrapper[4776]: I1204 09:39:22.371205 4776 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:22 crc kubenswrapper[4776]: I1204 09:39:22.371244 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:22 crc kubenswrapper[4776]: I1204 09:39:22.371253 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:23 crc kubenswrapper[4776]: I1204 09:39:23.019813 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:23 crc kubenswrapper[4776]: I1204 09:39:23.020400 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:23 crc kubenswrapper[4776]: I1204 09:39:23.022111 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:23 crc kubenswrapper[4776]: I1204 09:39:23.022178 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:23 crc kubenswrapper[4776]: I1204 09:39:23.022201 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:23 crc kubenswrapper[4776]: I1204 09:39:23.373854 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:23 crc kubenswrapper[4776]: I1204 09:39:23.374054 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 09:39:23 crc kubenswrapper[4776]: I1204 09:39:23.374105 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:23 crc kubenswrapper[4776]: I1204 09:39:23.375659 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:23 crc kubenswrapper[4776]: I1204 09:39:23.375709 4776 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:23 crc kubenswrapper[4776]: I1204 09:39:23.375725 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:24 crc kubenswrapper[4776]: I1204 09:39:24.328303 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:24 crc kubenswrapper[4776]: I1204 09:39:24.328472 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:24 crc kubenswrapper[4776]: I1204 09:39:24.329524 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:24 crc kubenswrapper[4776]: I1204 09:39:24.329815 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:24 crc kubenswrapper[4776]: I1204 09:39:24.329829 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:25 crc kubenswrapper[4776]: I1204 09:39:25.035756 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:25 crc kubenswrapper[4776]: I1204 09:39:25.036208 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:25 crc kubenswrapper[4776]: I1204 09:39:25.038070 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:25 crc kubenswrapper[4776]: I1204 09:39:25.038124 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:25 crc kubenswrapper[4776]: I1204 09:39:25.038135 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 04 09:39:25 crc kubenswrapper[4776]: I1204 09:39:25.388708 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 04 09:39:25 crc kubenswrapper[4776]: I1204 09:39:25.388904 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:25 crc kubenswrapper[4776]: I1204 09:39:25.390284 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:25 crc kubenswrapper[4776]: I1204 09:39:25.390340 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:25 crc kubenswrapper[4776]: I1204 09:39:25.390350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:25 crc kubenswrapper[4776]: E1204 09:39:25.525416 4776 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 09:39:27 crc kubenswrapper[4776]: E1204 09:39:27.024027 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 04 09:39:27 crc kubenswrapper[4776]: I1204 09:39:27.369634 4776 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 04 09:39:27 crc kubenswrapper[4776]: E1204 09:39:27.392077 4776 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" 
logger="UnhandledError" Dec 04 09:39:28 crc kubenswrapper[4776]: E1204 09:39:28.374162 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 04 09:39:28 crc kubenswrapper[4776]: I1204 09:39:28.612078 4776 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 04 09:39:28 crc kubenswrapper[4776]: I1204 09:39:28.612162 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 04 09:39:28 crc kubenswrapper[4776]: I1204 09:39:28.624501 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:28 crc kubenswrapper[4776]: I1204 09:39:28.626169 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:28 crc kubenswrapper[4776]: I1204 09:39:28.626226 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:28 crc kubenswrapper[4776]: I1204 09:39:28.626248 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:28 crc kubenswrapper[4776]: I1204 09:39:28.626288 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 
04 09:39:28 crc kubenswrapper[4776]: I1204 09:39:28.630384 4776 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 04 09:39:28 crc kubenswrapper[4776]: I1204 09:39:28.630456 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 04 09:39:29 crc kubenswrapper[4776]: I1204 09:39:29.690738 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 04 09:39:29 crc kubenswrapper[4776]: I1204 09:39:29.690996 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:29 crc kubenswrapper[4776]: I1204 09:39:29.692414 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:29 crc kubenswrapper[4776]: I1204 09:39:29.692484 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:29 crc kubenswrapper[4776]: I1204 09:39:29.692504 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:29 crc kubenswrapper[4776]: I1204 09:39:29.717548 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 04 09:39:30 crc kubenswrapper[4776]: I1204 09:39:30.406168 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 04 09:39:30 crc kubenswrapper[4776]: 
I1204 09:39:30.543824 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:30 crc kubenswrapper[4776]: I1204 09:39:30.545026 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:30 crc kubenswrapper[4776]: I1204 09:39:30.545230 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:30 crc kubenswrapper[4776]: I1204 09:39:30.545418 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.455534 4776 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.455622 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.546763 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.548093 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.548124 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.548136 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.716581 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.737538 4776 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.743703 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.743883 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.745133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.745175 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.745183 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:31 crc kubenswrapper[4776]: I1204 09:39:31.749012 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:32 crc kubenswrapper[4776]: I1204 09:39:32.549077 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:32 crc kubenswrapper[4776]: I1204 09:39:32.550058 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:32 crc kubenswrapper[4776]: I1204 
09:39:32.550122 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:32 crc kubenswrapper[4776]: I1204 09:39:32.550142 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.026737 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.026931 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.028148 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.028182 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.028192 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.624193 4776 trace.go:236] Trace[1514232186]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 09:39:20.040) (total time: 13583ms): Dec 04 09:39:33 crc kubenswrapper[4776]: Trace[1514232186]: ---"Objects listed" error: 13583ms (09:39:33.624) Dec 04 09:39:33 crc kubenswrapper[4776]: Trace[1514232186]: [13.583876127s] [13.583876127s] END Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.624244 4776 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.624850 4776 trace.go:236] Trace[619574415]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 09:39:19.880) (total time: 
13743ms): Dec 04 09:39:33 crc kubenswrapper[4776]: Trace[619574415]: ---"Objects listed" error: 13743ms (09:39:33.624) Dec 04 09:39:33 crc kubenswrapper[4776]: Trace[619574415]: [13.743988273s] [13.743988273s] END Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.624868 4776 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.625334 4776 trace.go:236] Trace[993492944]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 09:39:18.954) (total time: 14670ms): Dec 04 09:39:33 crc kubenswrapper[4776]: Trace[993492944]: ---"Objects listed" error: 14670ms (09:39:33.625) Dec 04 09:39:33 crc kubenswrapper[4776]: Trace[993492944]: [14.670996366s] [14.670996366s] END Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.625345 4776 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.626907 4776 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.627374 4776 trace.go:236] Trace[802972774]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 09:39:19.015) (total time: 14611ms): Dec 04 09:39:33 crc kubenswrapper[4776]: Trace[802972774]: ---"Objects listed" error: 14611ms (09:39:33.627) Dec 04 09:39:33 crc kubenswrapper[4776]: Trace[802972774]: [14.611935925s] [14.611935925s] END Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.627422 4776 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 09:39:33 crc kubenswrapper[4776]: E1204 09:39:33.634800 4776 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 
09:39:33.669831 4776 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33526->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.669844 4776 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33522->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.669963 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33526->192.168.126.11:17697: read: connection reset by peer" Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.670023 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33522->192.168.126.11:17697: read: connection reset by peer" Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.671076 4776 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 04 09:39:33 crc kubenswrapper[4776]: I1204 09:39:33.671116 4776 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.368928 4776 apiserver.go:52] "Watching apiserver" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.372393 4776 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.372749 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.373233 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.373283 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.373314 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.373311 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.373229 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.373569 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.373734 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.373797 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.374090 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.375346 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.376579 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.377396 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.377490 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.377796 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.377943 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.378168 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.378450 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.378981 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 
09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.405697 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.423540 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.439453 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.452868 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.464682 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.469980 4776 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.477212 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.487633 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532358 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532417 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532436 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532453 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532482 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532499 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532516 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 
09:39:34.532532 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532551 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532571 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532592 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532607 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532896 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: 
"serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532898 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532958 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.532975 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533122 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533156 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533175 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533193 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533207 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533224 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533281 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533323 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533338 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533379 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533395 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533413 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533428 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533443 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533464 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533480 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533495 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533511 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533525 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533540 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533556 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533571 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533587 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 
09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533606 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533640 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533657 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533676 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533695 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533715 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533731 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533749 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533766 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533783 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533802 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 
09:39:34.533822 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533839 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533885 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533902 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533933 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533950 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533965 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533981 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533997 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534013 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534031 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534074 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534091 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534107 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534137 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534158 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534176 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 09:39:34 crc 
kubenswrapper[4776]: I1204 09:39:34.534194 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534209 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534225 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534241 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534260 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534281 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" 
(UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534296 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534317 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534332 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534350 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534370 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534390 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534409 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534425 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534442 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534457 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534472 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 04 
09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534487 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534505 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534520 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534537 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534554 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534569 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536567 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536599 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536618 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536636 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536654 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536670 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536687 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536703 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536719 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536737 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536757 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536775 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536794 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536811 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537190 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537231 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537256 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537291 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537318 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537341 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537360 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537378 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 09:39:34 crc 
kubenswrapper[4776]: I1204 09:39:34.537398 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537420 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537443 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537463 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537484 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537505 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537868 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537895 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.538445 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.538485 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533221 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533489 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533753 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533777 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.533823 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534023 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534022 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534074 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534104 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534138 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534311 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534325 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534332 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534353 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534358 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534489 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534525 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534542 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534568 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534574 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534667 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534666 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534819 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534839 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534879 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534884 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534903 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534963 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.534980 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.535050 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.535124 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.535204 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.535293 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.535384 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.535393 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.535616 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.535646 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.535683 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.535713 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.535791 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.535977 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536026 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536052 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536245 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.536442 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.538217 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.537751 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.538878 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.538937 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.539609 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.539761 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.539807 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.539839 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.539985 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540146 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540182 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540205 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540227 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 04 
09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540250 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540270 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540294 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540316 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540358 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540382 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540403 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540427 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540449 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540582 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.540609 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.544348 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.544450 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.544542 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.544568 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.545014 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.545030 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.545042 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.545311 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.545451 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.545817 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.546280 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.546306 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.546166 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.546836 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.546973 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.547260 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.549155 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.551245 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.551317 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.551401 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.551466 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.551598 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.551748 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.551886 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.552088 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.552265 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.552230 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.553347 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.554781 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.554940 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.554969 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555018 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555038 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555058 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555088 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555108 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555137 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555159 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555184 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555209 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555234 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555261 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555282 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555309 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 
09:39:34.555335 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555355 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555372 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555391 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555409 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555429 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555448 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555468 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555484 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555521 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555643 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555743 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555815 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.555911 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556049 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556125 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " 
Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556177 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556221 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556271 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556319 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556368 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556413 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556463 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556517 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556568 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556612 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556658 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 09:39:34 crc 
kubenswrapper[4776]: I1204 09:39:34.556698 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556740 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556786 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556833 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556879 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556950 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.557004 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.557042 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.557088 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.557134 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.557172 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 
09:39:34.545431 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.562034 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556377 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556061 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556588 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.556836 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.563466 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.557171 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.557561 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.552192 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.558483 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.558498 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.558542 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.559046 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.559052 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.559551 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.559687 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.560477 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.560641 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.560836 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.560852 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.560889 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:39:35.059451265 +0000 UTC m=+19.925931642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.560952 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.561190 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.561475 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.561566 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.561694 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.561711 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.561827 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.562027 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.562051 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.562625 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.562743 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.562838 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.562862 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.563725 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.562951 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.563212 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.563427 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.563610 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.564027 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.564069 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.564295 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.564308 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.564810 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.565701 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.565724 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.565774 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.565808 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.565837 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.565904 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.565945 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.565970 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.565993 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.565838 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.566085 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). 
InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.566001 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.566194 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.566168 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.566356 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.566256 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.566508 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.566558 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.566600 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.566850 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.567020 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.567160 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.567354 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.567640 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.567688 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.567782 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.567790 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.568114 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.568248 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.568290 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.568304 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.568315 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.568367 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.568448 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:35.068425352 +0000 UTC m=+19.934905729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.568809 4776 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.568215 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.569198 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.569254 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:35.069225007 +0000 UTC m=+19.935705374 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.569488 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.569518 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.570151 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.570492 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.570610 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.570976 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.571757 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.572155 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.572168 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.573525 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.573985 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.574756 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.575073 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.575081 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.575146 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.575143 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.575207 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.575347 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.575379 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.575420 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: 
"kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.575614 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.575739 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.576298 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.575829 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.575934 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.576207 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.576509 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.576606 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.576760 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.576819 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.576892 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.577015 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.577118 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.577212 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.577315 4776 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.577382 4776 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.577471 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.577968 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578036 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578050 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578063 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578074 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578086 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578097 4776 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578107 4776 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578139 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath 
\"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578150 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578160 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578224 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578236 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578246 4776 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578256 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578265 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578274 4776 
reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578283 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578295 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578304 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578313 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578323 4776 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578354 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578366 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578376 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578385 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578471 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578632 4776 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.578650 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.577492 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 
04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579218 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579239 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579248 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579261 4776 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579271 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579283 4776 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579294 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579305 4776 
reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579318 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579329 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579339 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579350 4776 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579361 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579371 4776 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579382 4776 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579392 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579402 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579462 4776 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579473 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579484 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579494 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579506 4776 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579517 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579527 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579537 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579549 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579560 4776 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579572 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579582 4776 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579595 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579605 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579615 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579627 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579637 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579647 4776 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579657 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node 
\"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579666 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579676 4776 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579685 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579695 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579704 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579714 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579724 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579733 4776 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579770 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579782 4776 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579792 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579804 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579817 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579829 4776 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579842 4776 reconciler_common.go:293] "Volume detached for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579851 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579860 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579868 4776 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579877 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579886 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580056 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.579912 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580093 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580103 4776 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580112 4776 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580121 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580130 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580139 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 04 
09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580150 4776 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580161 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580404 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580416 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580425 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580433 4776 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580442 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580450 4776 reconciler_common.go:293] "Volume 
detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580461 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580470 4776 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580478 4776 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580487 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580495 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580504 4776 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580515 4776 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580524 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580533 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580542 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580551 4776 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580560 4776 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580570 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580578 4776 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node 
\"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580589 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580597 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580605 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580613 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580622 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580631 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580671 4776 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580680 4776 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580690 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580701 4776 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580709 4776 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580717 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580726 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580736 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580744 4776 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580754 4776 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580762 4776 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580771 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580780 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580788 4776 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580796 4776 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580804 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: 
I1204 09:39:34.580812 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580821 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580829 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580838 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580847 4776 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580855 4776 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580863 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580875 4776 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580882 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580890 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580900 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580908 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580939 4776 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580956 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580969 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 
crc kubenswrapper[4776]: I1204 09:39:34.580979 4776 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580987 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.580996 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.581005 4776 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.581013 4776 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.581021 4776 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.581030 4776 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.581040 4776 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.581051 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.584816 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.587065 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.587118 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.587537 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.587831 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.588414 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.588479 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.588935 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.590816 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.590851 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.591112 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.591137 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.591151 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.591207 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:35.091187552 +0000 UTC m=+19.957667919 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.591299 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.591317 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.591331 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:34 crc kubenswrapper[4776]: E1204 09:39:34.591412 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:35.091393299 +0000 UTC m=+19.957873676 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.591530 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.592147 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.592542 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.593036 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.593063 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.594114 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.594248 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.594425 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.594751 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4" exitCode=255 Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.594302 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.594822 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4"} Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.596265 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.596558 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.605759 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.606026 4776 scope.go:117] "RemoveContainer" containerID="ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.610979 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.616487 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.620224 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.625092 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.626847 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.629859 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.644126 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.658960 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.671214 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681585 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681639 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681708 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681724 4776 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681738 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681751 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681765 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681780 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 
09:39:34.681792 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681803 4776 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681816 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681828 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681840 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681851 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681862 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681873 4776 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681884 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681897 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681909 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681941 4776 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681952 4776 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681963 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681974 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681985 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.681995 4776 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.682006 4776 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.682017 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.682027 4776 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.682037 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.682048 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" 
DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.682059 4776 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.682070 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.682082 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.682093 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.682289 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.682342 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.682592 4776 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.685209 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.692908 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 09:39:34 crc kubenswrapper[4776]: I1204 09:39:34.700404 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 09:39:34 crc kubenswrapper[4776]: W1204 09:39:34.713417 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-76cacd1a4afa7f925ef92bf339499f807bae4126e8ce15d047e121a4672ecf24 WatchSource:0}: Error finding container 76cacd1a4afa7f925ef92bf339499f807bae4126e8ce15d047e121a4672ecf24: Status 404 returned error can't find the container with id 76cacd1a4afa7f925ef92bf339499f807bae4126e8ce15d047e121a4672ecf24 Dec 04 09:39:34 crc kubenswrapper[4776]: W1204 09:39:34.719211 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-9261a36bd5ddcc691fac9d2fee731af7ed9b140bec24f2d7ef5a110c6a7d06df WatchSource:0}: Error finding container 9261a36bd5ddcc691fac9d2fee731af7ed9b140bec24f2d7ef5a110c6a7d06df: Status 404 returned error can't find the container with id 9261a36bd5ddcc691fac9d2fee731af7ed9b140bec24f2d7ef5a110c6a7d06df Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.062105 4776 csr.go:261] certificate signing request csr-gbxqt is approved, waiting to be issued Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.076622 4776 csr.go:257] certificate signing request csr-gbxqt is issued Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.085066 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.085174 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.085207 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:35 crc kubenswrapper[4776]: E1204 09:39:35.085334 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:39:35 crc kubenswrapper[4776]: E1204 09:39:35.085394 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:39:35 crc kubenswrapper[4776]: E1204 09:39:35.085314 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:39:36.085270898 +0000 UTC m=+20.951751275 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:39:35 crc kubenswrapper[4776]: E1204 09:39:35.085549 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:36.085528336 +0000 UTC m=+20.952008713 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:39:35 crc kubenswrapper[4776]: E1204 09:39:35.085561 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:36.085555007 +0000 UTC m=+20.952035384 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.186591 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.186680 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:35 crc kubenswrapper[4776]: E1204 09:39:35.186834 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:39:35 crc kubenswrapper[4776]: E1204 09:39:35.186860 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:39:35 crc kubenswrapper[4776]: E1204 09:39:35.186863 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:39:35 crc kubenswrapper[4776]: E1204 09:39:35.186965 4776 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:39:35 crc kubenswrapper[4776]: E1204 09:39:35.186983 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:35 crc kubenswrapper[4776]: E1204 09:39:35.186874 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:35 crc kubenswrapper[4776]: E1204 09:39:35.187052 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:36.18702899 +0000 UTC m=+21.053509447 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:35 crc kubenswrapper[4776]: E1204 09:39:35.187114 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-04 09:39:36.187089472 +0000 UTC m=+21.053569899 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.288300 4776 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 04 09:39:35 crc kubenswrapper[4776]: W1204 09:39:35.289028 4776 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 04 09:39:35 crc kubenswrapper[4776]: W1204 09:39:35.289066 4776 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 04 09:39:35 crc kubenswrapper[4776]: W1204 09:39:35.289090 4776 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Dec 04 09:39:35 crc kubenswrapper[4776]: W1204 09:39:35.289127 4776 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch 
close - watch lasted less than a second and no items received Dec 04 09:39:35 crc kubenswrapper[4776]: W1204 09:39:35.289030 4776 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Dec 04 09:39:35 crc kubenswrapper[4776]: W1204 09:39:35.289036 4776 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 04 09:39:35 crc kubenswrapper[4776]: W1204 09:39:35.289046 4776 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Dec 04 09:39:35 crc kubenswrapper[4776]: W1204 09:39:35.289086 4776 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Dec 04 09:39:35 crc kubenswrapper[4776]: W1204 09:39:35.289117 4776 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.455818 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.456375 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.457155 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.457785 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.458359 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.458886 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.461487 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.462169 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.463272 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.463822 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.464856 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.465620 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.466539 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.467041 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.468114 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.468720 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.469310 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.471390 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.472048 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.472629 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.473525 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.474132 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.475102 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.475714 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.476118 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.477103 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.478029 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.478189 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.478658 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.479235 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.480030 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.480491 4776 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.480590 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.482653 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.483140 4776 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.483509 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.484997 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.486009 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.486538 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.487533 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.488218 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.489010 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.489570 4776 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.490562 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.491622 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.492085 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.493014 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.493569 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.494640 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.495154 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.495673 4776 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.496568 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.497313 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.497554 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.498260 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.498747 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.512968 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.532761 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.557290 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.576319 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.595860 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.607568 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.609571 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b"} Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.609874 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.610661 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9261a36bd5ddcc691fac9d2fee731af7ed9b140bec24f2d7ef5a110c6a7d06df"} Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.611511 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29"} Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.611548 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"76cacd1a4afa7f925ef92bf339499f807bae4126e8ce15d047e121a4672ecf24"} Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.612823 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435"} Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.612865 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6"} Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.612878 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2c74de709b0942827f6aad7ee74b60d16f7235e3a024f96a2cfd0e1b683a37a7"} Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.627731 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.645083 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.657884 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.677437 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.696311 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.711147 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.745983 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.803296 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.840516 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.861843 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.889212 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-d6wbt"] Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.889710 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-l99mn"] Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.889930 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.889965 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-l99mn" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.890735 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.897197 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.907375 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.907628 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.907519 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.908865 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.909036 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.914324 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.924153 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.931678 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.949277 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.971703 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.994202 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c24f72b5-b018-4505-baa0-b5c4e6066364-hosts-file\") pod \"node-resolver-l99mn\" (UID: \"c24f72b5-b018-4505-baa0-b5c4e6066364\") " pod="openshift-dns/node-resolver-l99mn" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.994257 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26cd7\" (UniqueName: \"kubernetes.io/projected/a57f7940-a976-4c85-bcb7-a1c24ba08266-kube-api-access-26cd7\") pod \"machine-config-daemon-d6wbt\" (UID: \"a57f7940-a976-4c85-bcb7-a1c24ba08266\") " pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.994277 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a57f7940-a976-4c85-bcb7-a1c24ba08266-rootfs\") pod \"machine-config-daemon-d6wbt\" (UID: \"a57f7940-a976-4c85-bcb7-a1c24ba08266\") " pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.994291 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a57f7940-a976-4c85-bcb7-a1c24ba08266-proxy-tls\") pod \"machine-config-daemon-d6wbt\" (UID: \"a57f7940-a976-4c85-bcb7-a1c24ba08266\") " pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.994304 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a57f7940-a976-4c85-bcb7-a1c24ba08266-mcd-auth-proxy-config\") pod \"machine-config-daemon-d6wbt\" (UID: \"a57f7940-a976-4c85-bcb7-a1c24ba08266\") " pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.994326 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vl5c\" (UniqueName: \"kubernetes.io/projected/c24f72b5-b018-4505-baa0-b5c4e6066364-kube-api-access-2vl5c\") pod \"node-resolver-l99mn\" (UID: \"c24f72b5-b018-4505-baa0-b5c4e6066364\") " pod="openshift-dns/node-resolver-l99mn" Dec 04 09:39:35 crc kubenswrapper[4776]: I1204 09:39:35.996862 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.022237 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.041371 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.063462 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.077674 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-04 09:34:35 +0000 UTC, rotation deadline is 2026-10-21 11:14:13.460233797 +0000 UTC Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.077741 4776 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7705h34m37.382495473s for next certificate rotation Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.089736 4776 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.094616 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.094729 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c24f72b5-b018-4505-baa0-b5c4e6066364-hosts-file\") pod \"node-resolver-l99mn\" (UID: \"c24f72b5-b018-4505-baa0-b5c4e6066364\") " pod="openshift-dns/node-resolver-l99mn" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 
09:39:36.094764 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.094792 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.094866 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c24f72b5-b018-4505-baa0-b5c4e6066364-hosts-file\") pod \"node-resolver-l99mn\" (UID: \"c24f72b5-b018-4505-baa0-b5c4e6066364\") " pod="openshift-dns/node-resolver-l99mn" Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.094979 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.095010 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26cd7\" (UniqueName: \"kubernetes.io/projected/a57f7940-a976-4c85-bcb7-a1c24ba08266-kube-api-access-26cd7\") pod \"machine-config-daemon-d6wbt\" (UID: \"a57f7940-a976-4c85-bcb7-a1c24ba08266\") " pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.095012 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:39:38.094985145 +0000 UTC m=+22.961465542 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.095070 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a57f7940-a976-4c85-bcb7-a1c24ba08266-rootfs\") pod \"machine-config-daemon-d6wbt\" (UID: \"a57f7940-a976-4c85-bcb7-a1c24ba08266\") " pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.094990 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.095096 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:38.095069347 +0000 UTC m=+22.961549724 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.095116 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a57f7940-a976-4c85-bcb7-a1c24ba08266-proxy-tls\") pod \"machine-config-daemon-d6wbt\" (UID: \"a57f7940-a976-4c85-bcb7-a1c24ba08266\") " pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.095140 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:38.095128569 +0000 UTC m=+22.961609006 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.095161 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a57f7940-a976-4c85-bcb7-a1c24ba08266-mcd-auth-proxy-config\") pod \"machine-config-daemon-d6wbt\" (UID: \"a57f7940-a976-4c85-bcb7-a1c24ba08266\") " pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.095190 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a57f7940-a976-4c85-bcb7-a1c24ba08266-rootfs\") pod \"machine-config-daemon-d6wbt\" (UID: \"a57f7940-a976-4c85-bcb7-a1c24ba08266\") " pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.095208 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vl5c\" (UniqueName: \"kubernetes.io/projected/c24f72b5-b018-4505-baa0-b5c4e6066364-kube-api-access-2vl5c\") pod \"node-resolver-l99mn\" (UID: \"c24f72b5-b018-4505-baa0-b5c4e6066364\") " pod="openshift-dns/node-resolver-l99mn" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.096116 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a57f7940-a976-4c85-bcb7-a1c24ba08266-mcd-auth-proxy-config\") pod \"machine-config-daemon-d6wbt\" (UID: \"a57f7940-a976-4c85-bcb7-a1c24ba08266\") " pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:36 crc 
kubenswrapper[4776]: I1204 09:39:36.099689 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a57f7940-a976-4c85-bcb7-a1c24ba08266-proxy-tls\") pod \"machine-config-daemon-d6wbt\" (UID: \"a57f7940-a976-4c85-bcb7-a1c24ba08266\") " pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.128859 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.140782 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26cd7\" (UniqueName: \"kubernetes.io/projected/a57f7940-a976-4c85-bcb7-a1c24ba08266-kube-api-access-26cd7\") pod \"machine-config-daemon-d6wbt\" (UID: \"a57f7940-a976-4c85-bcb7-a1c24ba08266\") " pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.145168 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vl5c\" (UniqueName: 
\"kubernetes.io/projected/c24f72b5-b018-4505-baa0-b5c4e6066364-kube-api-access-2vl5c\") pod \"node-resolver-l99mn\" (UID: \"c24f72b5-b018-4505-baa0-b5c4e6066364\") " pod="openshift-dns/node-resolver-l99mn" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.176934 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.181093 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.195815 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.195877 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 
09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.196025 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.196043 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.196056 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.196103 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:38.196085456 +0000 UTC m=+23.062565833 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.196428 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.196447 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.196458 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.196485 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:38.196476968 +0000 UTC m=+23.062957345 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.204277 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.211429 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l99mn" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.218656 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.274764 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.285798 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.292863 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.300265 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-7xv6z"] Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.300625 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.302762 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.303595 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.303760 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.305248 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.307614 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.317155 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-vzlvd"] Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.319938 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.322458 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-q6zk4"] Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.323454 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.324362 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.324414 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.329116 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.330367 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.330443 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.330605 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.330721 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.330731 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.330936 4776 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.347408 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.357579 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.368006 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.381485 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398375 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-multus-socket-dir-parent\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398418 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-multus-cni-dir\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398440 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99mbf\" (UniqueName: \"kubernetes.io/projected/253a8526-0cc1-4441-a032-6f8f96b66f40-kube-api-access-99mbf\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398461 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-etc-openvswitch\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398478 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-ovn\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398494 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-var-lib-cni-bin\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398565 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-var-lib-kubelet\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398617 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovnkube-script-lib\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398645 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-run-netns\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398665 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-run-multus-certs\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398695 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/423f8d5c-40c6-4efe-935f-7a9373d6becd-cni-binary-copy\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398717 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vpnn\" (UniqueName: \"kubernetes.io/projected/fdc73cf8-973a-4254-9339-6c9f90c225bb-kube-api-access-6vpnn\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398741 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/253a8526-0cc1-4441-a032-6f8f96b66f40-os-release\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398762 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/253a8526-0cc1-4441-a032-6f8f96b66f40-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398782 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-run-ovn-kubernetes\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398803 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovnkube-config\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398851 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-hostroot\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398873 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-run-netns\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398893 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-cni-netd\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398944 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95p9z\" (UniqueName: \"kubernetes.io/projected/423f8d5c-40c6-4efe-935f-7a9373d6becd-kube-api-access-95p9z\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398969 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-kubelet\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.398988 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-cnibin\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399011 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-var-lib-cni-multus\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399031 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-log-socket\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399057 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/253a8526-0cc1-4441-a032-6f8f96b66f40-system-cni-dir\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399084 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-systemd-units\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399107 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-var-lib-openvswitch\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399127 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-cni-bin\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399161 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-os-release\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399184 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-multus-conf-dir\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399209 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/253a8526-0cc1-4441-a032-6f8f96b66f40-cnibin\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " 
pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399232 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-slash\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399259 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-env-overrides\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399291 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/253a8526-0cc1-4441-a032-6f8f96b66f40-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399342 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399371 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-system-cni-dir\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399393 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/423f8d5c-40c6-4efe-935f-7a9373d6becd-multus-daemon-config\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399418 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-etc-kubernetes\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399439 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovn-node-metrics-cert\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399483 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-openvswitch\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399515 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-run-k8s-cni-cncf-io\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399546 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/253a8526-0cc1-4441-a032-6f8f96b66f40-cni-binary-copy\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399571 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-systemd\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.399599 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-node-log\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.400250 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.418201 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.434444 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.451269 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.451402 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.451755 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.451808 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.451848 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.451888 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.453407 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.469332 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 
09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.490740 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500080 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-var-lib-openvswitch\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500136 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-cni-bin\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500177 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-os-release\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500202 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/253a8526-0cc1-4441-a032-6f8f96b66f40-system-cni-dir\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500224 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-systemd-units\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500248 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-multus-conf-dir\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500273 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/253a8526-0cc1-4441-a032-6f8f96b66f40-system-cni-dir\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500288 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-systemd-units\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500241 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-var-lib-openvswitch\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500277 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/253a8526-0cc1-4441-a032-6f8f96b66f40-cnibin\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500356 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-slash\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500386 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-slash\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500404 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-multus-conf-dir\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500437 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-env-overrides\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500438 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/253a8526-0cc1-4441-a032-6f8f96b66f40-cnibin\") pod \"multus-additional-cni-plugins-vzlvd\" 
(UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500456 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-system-cni-dir\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500473 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/423f8d5c-40c6-4efe-935f-7a9373d6becd-multus-daemon-config\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500492 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/253a8526-0cc1-4441-a032-6f8f96b66f40-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500512 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500525 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-system-cni-dir\") pod \"multus-7xv6z\" (UID: 
\"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500531 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-etc-kubernetes\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500556 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-etc-kubernetes\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500576 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovn-node-metrics-cert\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500590 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-os-release\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500623 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-openvswitch\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500652 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-openvswitch\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500645 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500742 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/253a8526-0cc1-4441-a032-6f8f96b66f40-cni-binary-copy\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500753 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/253a8526-0cc1-4441-a032-6f8f96b66f40-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500766 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-systemd\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500791 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-systemd\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500806 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-node-log\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500854 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-run-k8s-cni-cncf-io\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500888 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-multus-socket-dir-parent\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500904 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-node-log\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500933 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-etc-openvswitch\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500954 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-run-k8s-cni-cncf-io\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.500969 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-multus-cni-dir\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501005 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-multus-socket-dir-parent\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501018 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99mbf\" (UniqueName: \"kubernetes.io/projected/253a8526-0cc1-4441-a032-6f8f96b66f40-kube-api-access-99mbf\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501047 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-var-lib-cni-bin\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501056 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-etc-openvswitch\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501070 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-var-lib-kubelet\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501096 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-var-lib-cni-bin\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501096 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-ovn\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501127 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-ovn\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501152 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-run-netns\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501165 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-var-lib-kubelet\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501178 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-run-multus-certs\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501205 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovnkube-script-lib\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501205 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-run-netns\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501025 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-multus-cni-dir\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501229 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/423f8d5c-40c6-4efe-935f-7a9373d6becd-cni-binary-copy\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501255 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-run-multus-certs\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501258 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vpnn\" (UniqueName: \"kubernetes.io/projected/fdc73cf8-973a-4254-9339-6c9f90c225bb-kube-api-access-6vpnn\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501298 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-run-ovn-kubernetes\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501324 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovnkube-config\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501357 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-hostroot\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501379 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/253a8526-0cc1-4441-a032-6f8f96b66f40-os-release\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501403 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/253a8526-0cc1-4441-a032-6f8f96b66f40-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501406 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-env-overrides\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501429 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95p9z\" (UniqueName: 
\"kubernetes.io/projected/423f8d5c-40c6-4efe-935f-7a9373d6becd-kube-api-access-95p9z\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501506 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-kubelet\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501530 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-run-netns\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501547 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-cni-netd\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501565 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-log-socket\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501576 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501593 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-cnibin\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501618 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-var-lib-cni-multus\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501633 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/253a8526-0cc1-4441-a032-6f8f96b66f40-os-release\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501592 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/253a8526-0cc1-4441-a032-6f8f96b66f40-cni-binary-copy\") pod \"multus-additional-cni-plugins-vzlvd\" (UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501702 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-host-var-lib-cni-multus\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " 
pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501714 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-kubelet\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501739 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-log-socket\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501744 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-cnibin\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501301 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/423f8d5c-40c6-4efe-935f-7a9373d6becd-multus-daemon-config\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501751 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovnkube-script-lib\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501770 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-run-netns\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501754 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-cni-netd\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501797 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/423f8d5c-40c6-4efe-935f-7a9373d6becd-hostroot\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501937 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-cni-bin\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.501950 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/423f8d5c-40c6-4efe-935f-7a9373d6becd-cni-binary-copy\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.502213 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/253a8526-0cc1-4441-a032-6f8f96b66f40-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vzlvd\" 
(UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.502230 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovnkube-config\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.510301 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.510591 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovn-node-metrics-cert\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.522167 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95p9z\" (UniqueName: \"kubernetes.io/projected/423f8d5c-40c6-4efe-935f-7a9373d6becd-kube-api-access-95p9z\") pod \"multus-7xv6z\" (UID: \"423f8d5c-40c6-4efe-935f-7a9373d6becd\") " pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.522226 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99mbf\" (UniqueName: \"kubernetes.io/projected/253a8526-0cc1-4441-a032-6f8f96b66f40-kube-api-access-99mbf\") pod \"multus-additional-cni-plugins-vzlvd\" 
(UID: \"253a8526-0cc1-4441-a032-6f8f96b66f40\") " pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.525790 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vpnn\" (UniqueName: \"kubernetes.io/projected/fdc73cf8-973a-4254-9339-6c9f90c225bb-kube-api-access-6vpnn\") pod \"ovnkube-node-q6zk4\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.526250 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.540768 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.558553 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.572607 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.584700 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.595428 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.597552 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.599860 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.616651 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec"} Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.616709 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500"} Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.616745 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"ece9a7907ac69661a47427c454b4b880f955913556a6c72d641e10cbff1706e8"} Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.618452 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l99mn" 
event={"ID":"c24f72b5-b018-4505-baa0-b5c4e6066364","Type":"ContainerStarted","Data":"f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec"} Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.618539 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l99mn" event={"ID":"c24f72b5-b018-4505-baa0-b5c4e6066364","Type":"ContainerStarted","Data":"1c65bb7ec0531d0f3e58ad80292b9ffd005c92f2e4f18d1e184ddc69b614e0ee"} Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.623043 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.627867 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.635898 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7xv6z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.638369 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.643366 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.651316 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.657203 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b3
9b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.674000 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: W1204 09:39:36.676299 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdc73cf8_973a_4254_9339_6c9f90c225bb.slice/crio-afe785251610872c9543291e705094727e6baef0f06e8dfb4e99cdaa291e722c WatchSource:0}: Error finding container afe785251610872c9543291e705094727e6baef0f06e8dfb4e99cdaa291e722c: Status 404 returned error can't find the container with id afe785251610872c9543291e705094727e6baef0f06e8dfb4e99cdaa291e722c Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.695278 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.696013 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.722056 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.741184 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.763080 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.793815 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.817801 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.825864 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.832141 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.835473 4776 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.838328 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.838375 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.840313 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.840456 4776 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.856432 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.858596 4776 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.859007 4776 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.862890 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.862954 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.862966 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.862991 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.863008 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:36Z","lastTransitionTime":"2025-12-04T09:39:36Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.911204 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.915703 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"
sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":48599861
6},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\
\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.919878 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.919971 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.919984 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.920006 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.920020 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:36Z","lastTransitionTime":"2025-12-04T09:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.927770 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.933849 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.937629 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.937671 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.937681 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.937700 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.937710 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:36Z","lastTransitionTime":"2025-12-04T09:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.948013 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.953126 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.957613 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.957827 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.957954 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.958069 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.958234 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:36Z","lastTransitionTime":"2025-12-04T09:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.964317 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.971793 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.977354 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.977614 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.977715 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.977802 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.977883 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:36Z","lastTransitionTime":"2025-12-04T09:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.981090 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.996438 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: E1204 09:39:36.996934 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.996497 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:36Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.999083 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.999123 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.999136 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.999155 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:36 crc kubenswrapper[4776]: I1204 09:39:36.999168 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:36Z","lastTransitionTime":"2025-12-04T09:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.101568 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.101621 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.101634 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.101653 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.101667 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:37Z","lastTransitionTime":"2025-12-04T09:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.204010 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.204052 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.204063 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.204083 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.204093 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:37Z","lastTransitionTime":"2025-12-04T09:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.307836 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.307883 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.307894 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.307931 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.307943 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:37Z","lastTransitionTime":"2025-12-04T09:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.410637 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.410684 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.410694 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.410713 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.410728 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:37Z","lastTransitionTime":"2025-12-04T09:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.513270 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.513603 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.513617 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.513638 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.513650 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:37Z","lastTransitionTime":"2025-12-04T09:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.616945 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.616991 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.617001 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.617018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.617029 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:37Z","lastTransitionTime":"2025-12-04T09:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.623142 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7xv6z" event={"ID":"423f8d5c-40c6-4efe-935f-7a9373d6becd","Type":"ContainerStarted","Data":"4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.623212 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7xv6z" event={"ID":"423f8d5c-40c6-4efe-935f-7a9373d6becd","Type":"ContainerStarted","Data":"5cec6581f7a5855d67256654a802efb469302a070427092b239271b80d6db2de"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.624902 4776 generic.go:334] "Generic (PLEG): container finished" podID="253a8526-0cc1-4441-a032-6f8f96b66f40" containerID="805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37" exitCode=0 Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.624953 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" event={"ID":"253a8526-0cc1-4441-a032-6f8f96b66f40","Type":"ContainerDied","Data":"805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.625231 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" event={"ID":"253a8526-0cc1-4441-a032-6f8f96b66f40","Type":"ContainerStarted","Data":"5507275c0392577c7996bf1d2265c99ca993ec9e636fc28ff494b7fcc9048f0b"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.627471 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.628864 4776 generic.go:334] "Generic (PLEG): container 
finished" podID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerID="b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2" exitCode=0 Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.628896 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.628965 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerStarted","Data":"afe785251610872c9543291e705094727e6baef0f06e8dfb4e99cdaa291e722c"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.637235 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.653470 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.668728 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.685081 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.699763 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.717895 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.722777 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.722817 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.722843 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.722867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.722881 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:37Z","lastTransitionTime":"2025-12-04T09:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.741882 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.756077 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.769466 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 
09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.810148 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.828019 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.828077 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.828151 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.828167 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.828189 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.828206 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:37Z","lastTransitionTime":"2025-12-04T09:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.849556 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.871709 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.883048 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.901160 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.915478 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.927942 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.932220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.932267 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.932277 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.932295 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.932308 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:37Z","lastTransitionTime":"2025-12-04T09:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.941649 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.953639 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.968359 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.983011 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:37 crc kubenswrapper[4776]: I1204 09:39:37.995183 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:37Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.008150 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.019462 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.034490 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.034531 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.034541 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.034568 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.034578 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:38Z","lastTransitionTime":"2025-12-04T09:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.124254 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.124394 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.124440 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:39:42.124413364 +0000 UTC m=+26.990893741 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.124469 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.124537 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.124555 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.124617 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:42.12460434 +0000 UTC m=+26.991084717 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.124635 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:42.124627681 +0000 UTC m=+26.991108048 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.137009 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.137046 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.137056 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.137081 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.137090 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:38Z","lastTransitionTime":"2025-12-04T09:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.225229 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.225293 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.225473 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.225498 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.225512 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.225582 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:42.225562027 +0000 UTC m=+27.092042404 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.225582 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.225634 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.225657 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.225757 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:42.225726522 +0000 UTC m=+27.092206939 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.240024 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.240100 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.240124 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.240157 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.240180 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:38Z","lastTransitionTime":"2025-12-04T09:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.343408 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.343469 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.343487 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.343519 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.343537 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:38Z","lastTransitionTime":"2025-12-04T09:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.447325 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.447630 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.447646 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.447670 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.447682 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:38Z","lastTransitionTime":"2025-12-04T09:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.451573 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.451616 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.451662 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.451714 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.451832 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:39:38 crc kubenswrapper[4776]: E1204 09:39:38.451996 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.461008 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.470301 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.472857 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.479545 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.494380 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.508028 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.528589 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.545055 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.550149 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.550192 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.550235 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:38 crc 
kubenswrapper[4776]: I1204 09:39:38.550252 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.550261 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:38Z","lastTransitionTime":"2025-12-04T09:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.560505 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.573190 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.586413 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.609931 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.626660 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.635177 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerStarted","Data":"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142"} Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.635234 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerStarted","Data":"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec"} Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.635244 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerStarted","Data":"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee"} Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.637423 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" event={"ID":"253a8526-0cc1-4441-a032-6f8f96b66f40","Type":"ContainerStarted","Data":"498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43"} Dec 04 
09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.644582 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.653021 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.653089 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.653102 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.653128 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:38 crc 
kubenswrapper[4776]: I1204 09:39:38.653143 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:38Z","lastTransitionTime":"2025-12-04T09:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.665978 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.678822 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.696077 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.716048 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.733188 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.745887 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.755989 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.756033 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.756045 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.756063 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.756076 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:38Z","lastTransitionTime":"2025-12-04T09:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.761816 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z 
is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.785570 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.798757 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.811184 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.823033 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.836813 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.851986 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.859638 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.859675 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.859684 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.859702 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 
09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.859714 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:38Z","lastTransitionTime":"2025-12-04T09:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.871442 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:38Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.963326 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.963367 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.963378 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.963399 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:38 crc kubenswrapper[4776]: I1204 09:39:38.963412 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:38Z","lastTransitionTime":"2025-12-04T09:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.066103 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.066681 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.066794 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.066879 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.067011 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:39Z","lastTransitionTime":"2025-12-04T09:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.169357 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.169396 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.169404 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.169422 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.169434 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:39Z","lastTransitionTime":"2025-12-04T09:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.271981 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.272046 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.272061 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.272086 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.272099 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:39Z","lastTransitionTime":"2025-12-04T09:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.301708 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-6hbgv"] Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.302139 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6hbgv" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.305975 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.306118 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.306198 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.305975 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.316278 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0
f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.333847 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.336544 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c7964ed5-5863-48f0-a329-1ff880943f79-host\") pod \"node-ca-6hbgv\" (UID: \"c7964ed5-5863-48f0-a329-1ff880943f79\") " pod="openshift-image-registry/node-ca-6hbgv" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.336593 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm5p9\" (UniqueName: \"kubernetes.io/projected/c7964ed5-5863-48f0-a329-1ff880943f79-kube-api-access-jm5p9\") pod \"node-ca-6hbgv\" (UID: \"c7964ed5-5863-48f0-a329-1ff880943f79\") " pod="openshift-image-registry/node-ca-6hbgv" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.336624 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serviceca\" (UniqueName: \"kubernetes.io/configmap/c7964ed5-5863-48f0-a329-1ff880943f79-serviceca\") pod \"node-ca-6hbgv\" (UID: \"c7964ed5-5863-48f0-a329-1ff880943f79\") " pod="openshift-image-registry/node-ca-6hbgv" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.345842 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.358454 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.371398 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.374494 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.374526 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.374539 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:39 crc 
kubenswrapper[4776]: I1204 09:39:39.374560 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.374574 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:39Z","lastTransitionTime":"2025-12-04T09:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.431412 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.437087 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm5p9\" (UniqueName: \"kubernetes.io/projected/c7964ed5-5863-48f0-a329-1ff880943f79-kube-api-access-jm5p9\") pod \"node-ca-6hbgv\" (UID: \"c7964ed5-5863-48f0-a329-1ff880943f79\") " pod="openshift-image-registry/node-ca-6hbgv" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.437123 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c7964ed5-5863-48f0-a329-1ff880943f79-serviceca\") pod \"node-ca-6hbgv\" (UID: 
\"c7964ed5-5863-48f0-a329-1ff880943f79\") " pod="openshift-image-registry/node-ca-6hbgv" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.437181 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c7964ed5-5863-48f0-a329-1ff880943f79-host\") pod \"node-ca-6hbgv\" (UID: \"c7964ed5-5863-48f0-a329-1ff880943f79\") " pod="openshift-image-registry/node-ca-6hbgv" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.437244 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c7964ed5-5863-48f0-a329-1ff880943f79-host\") pod \"node-ca-6hbgv\" (UID: \"c7964ed5-5863-48f0-a329-1ff880943f79\") " pod="openshift-image-registry/node-ca-6hbgv" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.438209 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c7964ed5-5863-48f0-a329-1ff880943f79-serviceca\") pod \"node-ca-6hbgv\" (UID: \"c7964ed5-5863-48f0-a329-1ff880943f79\") " pod="openshift-image-registry/node-ca-6hbgv" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.456036 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.464244 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm5p9\" (UniqueName: \"kubernetes.io/projected/c7964ed5-5863-48f0-a329-1ff880943f79-kube-api-access-jm5p9\") pod \"node-ca-6hbgv\" (UID: \"c7964ed5-5863-48f0-a329-1ff880943f79\") " pod="openshift-image-registry/node-ca-6hbgv" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.471269 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.477492 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.477824 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.477963 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.478082 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.478189 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:39Z","lastTransitionTime":"2025-12-04T09:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.487467 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037
eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.501464 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.515035 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.528983 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.542110 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.552101 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.581210 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.581479 4776 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.581538 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.581602 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.581672 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:39Z","lastTransitionTime":"2025-12-04T09:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.616544 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6hbgv" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.640893 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6hbgv" event={"ID":"c7964ed5-5863-48f0-a329-1ff880943f79","Type":"ContainerStarted","Data":"f83362f584833fed199760d469626aeb747152bff9450c26da207b3a74b29cab"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.644394 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerStarted","Data":"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.644419 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerStarted","Data":"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.644431 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerStarted","Data":"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.646665 4776 generic.go:334] "Generic (PLEG): container finished" podID="253a8526-0cc1-4441-a032-6f8f96b66f40" containerID="498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43" exitCode=0 Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.646693 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" event={"ID":"253a8526-0cc1-4441-a032-6f8f96b66f40","Type":"ContainerDied","Data":"498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.662555 4776 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.675224 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.685128 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.685185 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.685199 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.685220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.685237 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:39Z","lastTransitionTime":"2025-12-04T09:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.690303 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.703627 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.717527 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.732744 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.745650 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.755994 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.783638 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.788291 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.788347 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.788361 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.788386 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.788401 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:39Z","lastTransitionTime":"2025-12-04T09:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.821305 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.865385 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.890755 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.890806 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.890816 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:39 crc 
kubenswrapper[4776]: I1204 09:39:39.890868 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.890884 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:39Z","lastTransitionTime":"2025-12-04T09:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.903424 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.947277 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.990709 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.993590 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.993702 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.993765 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.993838 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:39 crc kubenswrapper[4776]: I1204 09:39:39.993893 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:39Z","lastTransitionTime":"2025-12-04T09:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.096436 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.096474 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.096483 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.096499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.096511 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:40Z","lastTransitionTime":"2025-12-04T09:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.199150 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.199202 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.199219 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.199243 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.199263 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:40Z","lastTransitionTime":"2025-12-04T09:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.302348 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.302397 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.302412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.302435 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.302450 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:40Z","lastTransitionTime":"2025-12-04T09:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.405412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.405470 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.405482 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.405502 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.405515 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:40Z","lastTransitionTime":"2025-12-04T09:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.452169 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.452242 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.452245 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:40 crc kubenswrapper[4776]: E1204 09:39:40.452647 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:39:40 crc kubenswrapper[4776]: E1204 09:39:40.452798 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:39:40 crc kubenswrapper[4776]: E1204 09:39:40.452949 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.507767 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.507811 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.507821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.507840 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.507852 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:40Z","lastTransitionTime":"2025-12-04T09:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.610750 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.610804 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.610821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.610841 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.610854 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:40Z","lastTransitionTime":"2025-12-04T09:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.652493 4776 generic.go:334] "Generic (PLEG): container finished" podID="253a8526-0cc1-4441-a032-6f8f96b66f40" containerID="8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94" exitCode=0 Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.652585 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" event={"ID":"253a8526-0cc1-4441-a032-6f8f96b66f40","Type":"ContainerDied","Data":"8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94"} Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.653727 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6hbgv" event={"ID":"c7964ed5-5863-48f0-a329-1ff880943f79","Type":"ContainerStarted","Data":"7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230"} Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.667502 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.678350 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.692460 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.704142 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.713904 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.713981 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.713997 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.714020 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.714036 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:40Z","lastTransitionTime":"2025-12-04T09:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.719692 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z 
is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.743242 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.757568 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.775149 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.792622 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.804899 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.819553 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 
09:39:40.822001 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.822047 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.822056 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.822077 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.822087 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:40Z","lastTransitionTime":"2025-12-04T09:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.836336 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.852371 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.867607 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.884194 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.901257 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 
09:39:40.916058 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 
09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.924433 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.924487 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.924498 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.924518 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 
09:39:40.924530 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:40Z","lastTransitionTime":"2025-12-04T09:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.935035 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-poli
cy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.946430 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.960207 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.973124 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.983216 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:40 crc kubenswrapper[4776]: I1204 09:39:40.991899 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:40Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.001626 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.026530 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:41 crc 
kubenswrapper[4776]: I1204 09:39:41.026571 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.026580 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.026596 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.026608 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:41Z","lastTransitionTime":"2025-12-04T09:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.029422 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.057272 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.087217 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.109447 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.130337 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.130376 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.130387 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:41 crc 
kubenswrapper[4776]: I1204 09:39:41.130406 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.130416 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:41Z","lastTransitionTime":"2025-12-04T09:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.233478 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.233518 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.233529 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.233548 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.233565 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:41Z","lastTransitionTime":"2025-12-04T09:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.336873 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.336934 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.336945 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.336965 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.336981 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:41Z","lastTransitionTime":"2025-12-04T09:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.440403 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.440461 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.440477 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.440499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.440512 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:41Z","lastTransitionTime":"2025-12-04T09:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.544461 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.544504 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.544512 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.544527 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.544537 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:41Z","lastTransitionTime":"2025-12-04T09:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.647366 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.647423 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.647437 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.647459 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.647472 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:41Z","lastTransitionTime":"2025-12-04T09:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.663362 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerStarted","Data":"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c"} Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.666332 4776 generic.go:334] "Generic (PLEG): container finished" podID="253a8526-0cc1-4441-a032-6f8f96b66f40" containerID="4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b" exitCode=0 Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.666391 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" event={"ID":"253a8526-0cc1-4441-a032-6f8f96b66f40","Type":"ContainerDied","Data":"4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b"} Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.683276 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.702446 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.716713 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.729792 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.741876 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.755210 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.770008 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.784408 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.796796 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.808885 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.824079 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.835255 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.846029 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:41 crc kubenswrapper[4776]: I1204 09:39:41.856153 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:41Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.291070 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.291839 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:39:50.291799073 +0000 UTC m=+35.158279450 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.291972 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.292038 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.292120 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.292151 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.293001 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.293070 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.293086 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.293152 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:50.293142854 +0000 UTC m=+35.159623231 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.293712 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.293777 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.293815 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:50.293783733 +0000 UTC m=+35.160264280 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.293841 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:50.293830125 +0000 UTC m=+35.160310712 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.293884 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.293900 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.293909 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.293965 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:39:50.293952529 +0000 UTC m=+35.160432906 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.294137 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.294163 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.294174 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.294193 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.294213 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:42Z","lastTransitionTime":"2025-12-04T09:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.400365 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.400404 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.400412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.400429 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.400438 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:42Z","lastTransitionTime":"2025-12-04T09:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.452427 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.452609 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.452695 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.452752 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.453435 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:42 crc kubenswrapper[4776]: E1204 09:39:42.453718 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.510038 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.510081 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.510092 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.510110 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.510123 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:42Z","lastTransitionTime":"2025-12-04T09:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.612833 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.612891 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.612905 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.612946 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.612962 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:42Z","lastTransitionTime":"2025-12-04T09:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.716175 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.716206 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.716214 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.716229 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.716237 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:42Z","lastTransitionTime":"2025-12-04T09:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.819307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.819546 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.819606 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.819703 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.819760 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:42Z","lastTransitionTime":"2025-12-04T09:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.923409 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.923843 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.924036 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.924182 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:42 crc kubenswrapper[4776]: I1204 09:39:42.924339 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:42Z","lastTransitionTime":"2025-12-04T09:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.028664 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.028725 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.028743 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.028769 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.028785 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:43Z","lastTransitionTime":"2025-12-04T09:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.132276 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.132327 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.132341 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.132361 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.132373 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:43Z","lastTransitionTime":"2025-12-04T09:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.234939 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.234982 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.234996 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.235014 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.235027 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:43Z","lastTransitionTime":"2025-12-04T09:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.338448 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.338489 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.338501 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.338520 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.338531 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:43Z","lastTransitionTime":"2025-12-04T09:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.440788 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.441292 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.441416 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.441565 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.441663 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:43Z","lastTransitionTime":"2025-12-04T09:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.545105 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.545176 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.545203 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.545237 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.545265 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:43Z","lastTransitionTime":"2025-12-04T09:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.648980 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.649298 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.649308 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.649324 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.649333 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:43Z","lastTransitionTime":"2025-12-04T09:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.751939 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.751993 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.752025 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.752047 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.752059 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:43Z","lastTransitionTime":"2025-12-04T09:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.854758 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.854806 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.854817 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.854834 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.854847 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:43Z","lastTransitionTime":"2025-12-04T09:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.957287 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.957331 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.957339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.957356 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:43 crc kubenswrapper[4776]: I1204 09:39:43.957367 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:43Z","lastTransitionTime":"2025-12-04T09:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.060313 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.060382 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.060395 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.060419 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.060433 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:44Z","lastTransitionTime":"2025-12-04T09:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.163807 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.163888 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.163953 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.163984 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.164003 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:44Z","lastTransitionTime":"2025-12-04T09:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.270396 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.270445 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.270455 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.270472 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.270483 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:44Z","lastTransitionTime":"2025-12-04T09:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.373350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.373396 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.373405 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.373434 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.373446 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:44Z","lastTransitionTime":"2025-12-04T09:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.452305 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.452359 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.452409 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:44 crc kubenswrapper[4776]: E1204 09:39:44.452504 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:39:44 crc kubenswrapper[4776]: E1204 09:39:44.452668 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:39:44 crc kubenswrapper[4776]: E1204 09:39:44.452849 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.476239 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.476291 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.476301 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.476320 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.476333 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:44Z","lastTransitionTime":"2025-12-04T09:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.581544 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.581587 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.581607 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.581628 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.581642 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:44Z","lastTransitionTime":"2025-12-04T09:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.683584 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.683626 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.683637 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.683672 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.683684 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:44Z","lastTransitionTime":"2025-12-04T09:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.685642 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerStarted","Data":"639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504"} Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.686128 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.690903 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" event={"ID":"253a8526-0cc1-4441-a032-6f8f96b66f40","Type":"ContainerStarted","Data":"5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a"} Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.700685 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.712123 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.726813 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.742404 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.762844 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.777546 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.786860 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.786897 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.786907 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.786934 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.786945 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:44Z","lastTransitionTime":"2025-12-04T09:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.791536 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.803887 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.816764 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.832120 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.847684 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.861029 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.876703 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.890104 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.890149 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.890160 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.890181 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.890211 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.890195 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:44Z","lastTransitionTime":"2025-12-04T09:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.893412 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.913183 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.924652 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.940257 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.966594 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.982566 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.992272 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.992321 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.992334 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.992355 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.992369 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:44Z","lastTransitionTime":"2025-12-04T09:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:44 crc kubenswrapper[4776]: I1204 09:39:44.997957 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.011466 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.026324 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.041841 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.044226 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00
721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.061100 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.076014 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.091416 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.095937 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.096022 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.096037 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.096084 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.096101 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:45Z","lastTransitionTime":"2025-12-04T09:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.110320 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.126124 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.142754 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.157452 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.172521 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.186465 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.199179 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.199222 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.199233 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.199251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.199263 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:45Z","lastTransitionTime":"2025-12-04T09:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.200291 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.222155 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.238053 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.253164 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.271060 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.285011 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.302255 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.302315 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.302330 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.302353 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.302365 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:45Z","lastTransitionTime":"2025-12-04T09:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.304860 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99m
bf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.321925 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of 
http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.336832 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.351529 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.433284 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.433333 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.433345 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.433364 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.433377 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:45Z","lastTransitionTime":"2025-12-04T09:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.466560 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.479234 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.493553 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.506804 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.523423 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.541992 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.542047 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.542060 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.542080 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.542096 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:45Z","lastTransitionTime":"2025-12-04T09:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.552211 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.572147 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T0
9:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.588287 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.605077 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.620969 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.635760 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.644456 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.644523 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.644536 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.644557 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.644570 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:45Z","lastTransitionTime":"2025-12-04T09:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.652074 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99m
bf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.665248 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.674673 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.696963 4776 generic.go:334] "Generic (PLEG): container finished" podID="253a8526-0cc1-4441-a032-6f8f96b66f40" containerID="5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a" exitCode=0 Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.697051 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" event={"ID":"253a8526-0cc1-4441-a032-6f8f96b66f40","Type":"ContainerDied","Data":"5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a"} Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.697120 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.697535 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.710756 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.725430 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.726288 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.738483 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.748063 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.748099 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.748111 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:45 crc 
kubenswrapper[4776]: I1204 09:39:45.748128 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.748140 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:45Z","lastTransitionTime":"2025-12-04T09:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.750046 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.763711 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.788081 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.800524 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.812451 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.825535 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.837044 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.850397 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.851015 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.851070 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.851083 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.851102 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.851115 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:45Z","lastTransitionTime":"2025-12-04T09:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.866351 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.879107 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.891191 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.907257 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.921172 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.930903 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.942303 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.954674 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.955608 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.955632 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.955656 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.955677 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:45Z","lastTransitionTime":"2025-12-04T09:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.955953 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z 
is after 2025-08-24T17:21:41Z" Dec 04 09:39:45 crc kubenswrapper[4776]: I1204 09:39:45.991534 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.015513 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T0
9:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.038997 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.054551 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.058981 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.059022 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.059033 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.059051 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.059063 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:46Z","lastTransitionTime":"2025-12-04T09:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.068156 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.079618 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.093567 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.106894 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.116035 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.161423 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.161459 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.161466 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.161481 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.161490 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:46Z","lastTransitionTime":"2025-12-04T09:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.264397 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.264707 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.264717 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.264733 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.264744 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:46Z","lastTransitionTime":"2025-12-04T09:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.367770 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.367801 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.367809 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.367824 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.367834 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:46Z","lastTransitionTime":"2025-12-04T09:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.451347 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.451397 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.451346 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:46 crc kubenswrapper[4776]: E1204 09:39:46.451492 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:39:46 crc kubenswrapper[4776]: E1204 09:39:46.451550 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:39:46 crc kubenswrapper[4776]: E1204 09:39:46.451605 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.470658 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.470700 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.470711 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.470728 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.470739 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:46Z","lastTransitionTime":"2025-12-04T09:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.573479 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.573528 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.573538 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.573555 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.573569 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:46Z","lastTransitionTime":"2025-12-04T09:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.676224 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.676263 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.676273 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.676310 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.676320 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:46Z","lastTransitionTime":"2025-12-04T09:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.704834 4776 generic.go:334] "Generic (PLEG): container finished" podID="253a8526-0cc1-4441-a032-6f8f96b66f40" containerID="48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e" exitCode=0 Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.704892 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" event={"ID":"253a8526-0cc1-4441-a032-6f8f96b66f40","Type":"ContainerDied","Data":"48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e"} Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.705020 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.718750 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a849
2c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.738967 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.760340 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.775078 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.782705 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.782756 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.782771 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.782793 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.782807 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:46Z","lastTransitionTime":"2025-12-04T09:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.794285 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.813637 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.832588 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.854646 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.869196 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de
77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.884429 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.885012 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.885044 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.885057 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.885078 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.885091 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:46Z","lastTransitionTime":"2025-12-04T09:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.899182 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e
4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.915697 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.932098 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.946860 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.992423 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.992489 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.992506 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.992530 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:46 crc kubenswrapper[4776]: I1204 09:39:46.992546 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:46Z","lastTransitionTime":"2025-12-04T09:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.096171 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.096248 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.096268 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.096291 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.096306 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.199550 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.199613 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.199630 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.199650 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.199661 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.303539 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.303597 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.303609 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.303628 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.303644 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.373199 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.373242 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.373254 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.373271 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.373285 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: E1204 09:39:47.390808 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.396782 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.396836 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.396849 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.396872 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.396888 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: E1204 09:39:47.415091 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.419327 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.419378 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.419392 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.419415 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.419428 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: E1204 09:39:47.433746 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.438669 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.438727 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.438737 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.438758 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.438769 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: E1204 09:39:47.452755 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.457055 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.457107 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.457121 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.457142 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.457160 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: E1204 09:39:47.471790 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: E1204 09:39:47.472056 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.474640 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.474664 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.474673 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.474692 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.474709 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.578194 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.578250 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.578259 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.578282 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.578294 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.680896 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.680971 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.680980 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.680999 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.681014 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.710395 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/0.log" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.714225 4776 generic.go:334] "Generic (PLEG): container finished" podID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerID="639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504" exitCode=1 Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.714339 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504"} Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.714957 4776 scope.go:117] "RemoveContainer" containerID="639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.719340 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" event={"ID":"253a8526-0cc1-4441-a032-6f8f96b66f40","Type":"ContainerStarted","Data":"4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a"} Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.734780 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de
77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.753828 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.770671 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.786198 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.786256 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.786271 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.786295 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.786309 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.788099 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.803316 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.820593 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286e
e1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\
\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.837195 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.849078 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.862907 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.876948 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.890278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.890362 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.890386 4776 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.890407 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.890443 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.890553 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f
4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.901155 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.915203 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.936156 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"message\\\":\\\"pha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 09:39:46.916405 6010 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 09:39:46.916737 6010 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 09:39:46.917411 6010 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917450 6010 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917519 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.918453 6010 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 09:39:46.918499 6010 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1204 09:39:46.918531 6010 factory.go:656] Stopping watch factory\\\\nI1204 09:39:46.918554 6010 ovnkube.go:599] Stopped ovnkube\\\\nI1204 09:39:46.918590 6010 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1204 
09:39:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e
12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.948987 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.960073 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.979231 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.990433 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:47Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.994128 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.994154 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.994163 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.994179 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:47 crc kubenswrapper[4776]: I1204 09:39:47.994189 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:47Z","lastTransitionTime":"2025-12-04T09:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.005162 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.021778 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"message\\\":\\\"pha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 09:39:46.916405 6010 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 09:39:46.916737 6010 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 09:39:46.917411 6010 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917450 6010 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917519 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.918453 6010 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 09:39:46.918499 6010 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1204 09:39:46.918531 6010 factory.go:656] Stopping watch factory\\\\nI1204 09:39:46.918554 6010 ovnkube.go:599] Stopped ovnkube\\\\nI1204 09:39:46.918590 6010 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1204 
09:39:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e
12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.034514 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.048466 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.066966 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.090097 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.096188 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.096412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.096557 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.096789 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.097034 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:48Z","lastTransitionTime":"2025-12-04T09:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.106876 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.130394 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T0
9:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.153251 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.170808 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.199722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.199787 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.199797 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.199816 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.199826 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:48Z","lastTransitionTime":"2025-12-04T09:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.296012 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j"] Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.296578 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.298799 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.299254 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.302270 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.302311 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.302324 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.302345 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.302358 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:48Z","lastTransitionTime":"2025-12-04T09:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.315272 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.331835 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de
77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.345217 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.358276 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.373645 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.386266 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.389443 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f8a5a83-82d6-4af1-9afa-816275ced3a8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vk56j\" (UID: \"9f8a5a83-82d6-4af1-9afa-816275ced3a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.389507 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wp6l\" (UniqueName: \"kubernetes.io/projected/9f8a5a83-82d6-4af1-9afa-816275ced3a8-kube-api-access-6wp6l\") pod \"ovnkube-control-plane-749d76644c-vk56j\" (UID: \"9f8a5a83-82d6-4af1-9afa-816275ced3a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.389813 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f8a5a83-82d6-4af1-9afa-816275ced3a8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vk56j\" (UID: \"9f8a5a83-82d6-4af1-9afa-816275ced3a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.390063 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/9f8a5a83-82d6-4af1-9afa-816275ced3a8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vk56j\" (UID: \"9f8a5a83-82d6-4af1-9afa-816275ced3a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.405095 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.405452 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.405644 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.406063 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.406224 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:48Z","lastTransitionTime":"2025-12-04T09:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.407881 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.422817 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.435524 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.451358 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:48 crc kubenswrapper[4776]: E1204 09:39:48.451796 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.451465 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:48 crc kubenswrapper[4776]: E1204 09:39:48.452020 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.451498 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:48 crc kubenswrapper[4776]: E1204 09:39:48.452247 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.451328 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e193621
9d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.477773 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09
:39:46Z\\\",\\\"message\\\":\\\"pha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 09:39:46.916405 6010 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 09:39:46.916737 6010 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 09:39:46.917411 6010 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917450 6010 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917519 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.918453 6010 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 09:39:46.918499 6010 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1204 09:39:46.918531 6010 factory.go:656] Stopping watch factory\\\\nI1204 09:39:46.918554 6010 ovnkube.go:599] Stopped ovnkube\\\\nI1204 09:39:46.918590 6010 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1204 
09:39:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e
12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.490260 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.490858 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f8a5a83-82d6-4af1-9afa-816275ced3a8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vk56j\" (UID: \"9f8a5a83-82d6-4af1-9afa-816275ced3a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.490892 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f8a5a83-82d6-4af1-9afa-816275ced3a8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vk56j\" (UID: \"9f8a5a83-82d6-4af1-9afa-816275ced3a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.490943 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f8a5a83-82d6-4af1-9afa-816275ced3a8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vk56j\" (UID: \"9f8a5a83-82d6-4af1-9afa-816275ced3a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.490978 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wp6l\" (UniqueName: \"kubernetes.io/projected/9f8a5a83-82d6-4af1-9afa-816275ced3a8-kube-api-access-6wp6l\") pod \"ovnkube-control-plane-749d76644c-vk56j\" (UID: \"9f8a5a83-82d6-4af1-9afa-816275ced3a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.491780 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f8a5a83-82d6-4af1-9afa-816275ced3a8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vk56j\" (UID: \"9f8a5a83-82d6-4af1-9afa-816275ced3a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.491828 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f8a5a83-82d6-4af1-9afa-816275ced3a8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vk56j\" (UID: \"9f8a5a83-82d6-4af1-9afa-816275ced3a8\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.501646 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f8a5a83-82d6-4af1-9afa-816275ced3a8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vk56j\" (UID: \"9f8a5a83-82d6-4af1-9afa-816275ced3a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.502186 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.509104 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.509166 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.509184 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.509209 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.509224 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:48Z","lastTransitionTime":"2025-12-04T09:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.511772 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wp6l\" (UniqueName: \"kubernetes.io/projected/9f8a5a83-82d6-4af1-9afa-816275ced3a8-kube-api-access-6wp6l\") pod \"ovnkube-control-plane-749d76644c-vk56j\" (UID: \"9f8a5a83-82d6-4af1-9afa-816275ced3a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.515621 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.527669 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.610251 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.613505 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.613577 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.613592 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.613615 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.613629 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:48Z","lastTransitionTime":"2025-12-04T09:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:48 crc kubenswrapper[4776]: W1204 09:39:48.623303 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f8a5a83_82d6_4af1_9afa_816275ced3a8.slice/crio-5ef2c18cf22358e0795ae5359e399592369b0d7e2624212a256e49ce6ed47afd WatchSource:0}: Error finding container 5ef2c18cf22358e0795ae5359e399592369b0d7e2624212a256e49ce6ed47afd: Status 404 returned error can't find the container with id 5ef2c18cf22358e0795ae5359e399592369b0d7e2624212a256e49ce6ed47afd Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.716557 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.716594 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.716604 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.716622 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.716633 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:48Z","lastTransitionTime":"2025-12-04T09:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.724953 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/0.log" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.728003 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerStarted","Data":"8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383"} Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.728113 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.729238 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" event={"ID":"9f8a5a83-82d6-4af1-9afa-816275ced3a8","Type":"ContainerStarted","Data":"5ef2c18cf22358e0795ae5359e399592369b0d7e2624212a256e49ce6ed47afd"} Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.746736 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.760456 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.773107 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.791736 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.812860 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de
77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.819318 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.819370 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.819385 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.819408 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.819420 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:48Z","lastTransitionTime":"2025-12-04T09:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.827607 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037
eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.841081 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.853719 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.873836 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.888718 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.904894 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.919346 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.922527 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.922577 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.922598 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.922624 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.922640 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:48Z","lastTransitionTime":"2025-12-04T09:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.933539 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z 
is after 2025-08-24T17:21:41Z" Dec 04 09:39:48 crc kubenswrapper[4776]: I1204 09:39:48.964681 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"message\\\":\\\"pha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 09:39:46.916405 6010 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 
09:39:46.916737 6010 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 09:39:46.917411 6010 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917450 6010 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917519 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.918453 6010 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 09:39:46.918499 6010 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1204 09:39:46.918531 6010 factory.go:656] Stopping watch factory\\\\nI1204 09:39:46.918554 6010 ovnkube.go:599] Stopped ovnkube\\\\nI1204 09:39:46.918590 6010 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1204 
09:39:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:48Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.002889 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.029340 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.029397 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.029411 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.029434 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.029449 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:49Z","lastTransitionTime":"2025-12-04T09:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.133657 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.133745 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.133769 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.133798 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.133818 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:49Z","lastTransitionTime":"2025-12-04T09:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.236959 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.237022 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.237043 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.237071 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.237088 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:49Z","lastTransitionTime":"2025-12-04T09:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.340400 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.340445 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.340464 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.340484 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.340497 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:49Z","lastTransitionTime":"2025-12-04T09:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.443226 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.443303 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.443321 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.443351 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.443370 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:49Z","lastTransitionTime":"2025-12-04T09:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.546307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.546370 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.546388 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.546413 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.546440 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:49Z","lastTransitionTime":"2025-12-04T09:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.649237 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.649278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.649289 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.649307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.649320 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:49Z","lastTransitionTime":"2025-12-04T09:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.739680 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.752510 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.752556 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.752578 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.752602 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.752620 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:49Z","lastTransitionTime":"2025-12-04T09:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.799525 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-g5jzd"] Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.801430 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:49 crc kubenswrapper[4776]: E1204 09:39:49.801541 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.804163 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmz9v\" (UniqueName: \"kubernetes.io/projected/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-kube-api-access-wmz9v\") pod \"network-metrics-daemon-g5jzd\" (UID: \"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.804322 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs\") pod \"network-metrics-daemon-g5jzd\" (UID: \"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.821162 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.843866 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.856827 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.856888 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.856904 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.856948 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.856965 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:49Z","lastTransitionTime":"2025-12-04T09:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.863230 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.890507 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"message\\\":\\\"pha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 09:39:46.916405 6010 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 09:39:46.916737 6010 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 09:39:46.917411 6010 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917450 6010 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917519 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.918453 6010 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 09:39:46.918499 6010 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1204 09:39:46.918531 6010 factory.go:656] Stopping watch factory\\\\nI1204 09:39:46.918554 6010 ovnkube.go:599] Stopped ovnkube\\\\nI1204 09:39:46.918590 6010 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1204 
09:39:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.906104 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmz9v\" (UniqueName: \"kubernetes.io/projected/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-kube-api-access-wmz9v\") pod \"network-metrics-daemon-g5jzd\" (UID: \"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.906191 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs\") pod \"network-metrics-daemon-g5jzd\" (UID: \"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:49 crc 
kubenswrapper[4776]: E1204 09:39:49.906407 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:39:49 crc kubenswrapper[4776]: E1204 09:39:49.906527 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs podName:5cca4979-0471-4a2c-97ca-b6ec6fdd935d nodeName:}" failed. No retries permitted until 2025-12-04 09:39:50.406497119 +0000 UTC m=+35.272977506 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs") pod "network-metrics-daemon-g5jzd" (UID: "5cca4979-0471-4a2c-97ca-b6ec6fdd935d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.907591 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.928644 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmz9v\" (UniqueName: \"kubernetes.io/projected/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-kube-api-access-wmz9v\") pod \"network-metrics-daemon-g5jzd\" (UID: \"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.929249 4776 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.945776 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.957668 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.959963 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.959981 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.959989 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.960005 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.960014 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:49Z","lastTransitionTime":"2025-12-04T09:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.973622 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:49Z 
is after 2025-08-24T17:21:41Z" Dec 04 09:39:49 crc kubenswrapper[4776]: I1204 09:39:49.989261 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc 
kubenswrapper[4776]: I1204 09:39:50.005266 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40
a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 
09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.022487 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.036351 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.052821 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.063033 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.063068 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.063078 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.063093 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.063104 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:50Z","lastTransitionTime":"2025-12-04T09:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.065319 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.080252 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.170607 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.170667 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.170683 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.170710 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.170726 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:50Z","lastTransitionTime":"2025-12-04T09:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.273957 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.274014 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.274024 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.274043 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.274059 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:50Z","lastTransitionTime":"2025-12-04T09:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.311152 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.311391 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.311458 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.311506 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.311545 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.311711 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.311786 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:40:06.311761072 +0000 UTC m=+51.178241489 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.311869 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:40:06.311857485 +0000 UTC m=+51.178337892 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.311988 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.312047 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:40:06.31202688 +0000 UTC m=+51.178507297 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.312267 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.312309 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.312332 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.312371 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.312449 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.312478 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.312404 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 09:40:06.312386081 +0000 UTC m=+51.178866498 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.312618 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:40:06.312566017 +0000 UTC m=+51.179046444 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.376809 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.376859 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.376873 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.376893 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.376907 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:50Z","lastTransitionTime":"2025-12-04T09:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.412570 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs\") pod \"network-metrics-daemon-g5jzd\" (UID: \"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.412890 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.413055 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs podName:5cca4979-0471-4a2c-97ca-b6ec6fdd935d nodeName:}" failed. No retries permitted until 2025-12-04 09:39:51.413020169 +0000 UTC m=+36.279500586 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs") pod "network-metrics-daemon-g5jzd" (UID: "5cca4979-0471-4a2c-97ca-b6ec6fdd935d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.451465 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.451560 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.452079 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.452162 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.451602 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.452270 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.480474 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.480551 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.480560 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.480579 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.480592 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:50Z","lastTransitionTime":"2025-12-04T09:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.582842 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.582900 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.582933 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.582959 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.582973 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:50Z","lastTransitionTime":"2025-12-04T09:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.686377 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.686432 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.686445 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.686468 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.686485 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:50Z","lastTransitionTime":"2025-12-04T09:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.745498 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/1.log" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.746582 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/0.log" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.750058 4776 generic.go:334] "Generic (PLEG): container finished" podID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerID="8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383" exitCode=1 Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.750159 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383"} Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.750316 4776 scope.go:117] "RemoveContainer" containerID="639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.751232 4776 scope.go:117] "RemoveContainer" containerID="8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383" Dec 04 09:39:50 crc kubenswrapper[4776]: E1204 09:39:50.751459 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.753409 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" event={"ID":"9f8a5a83-82d6-4af1-9afa-816275ced3a8","Type":"ContainerStarted","Data":"b119452dfa29d27aba29850ed4a7fa19d507e7c5b4e038814a32a70511de1346"} Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.753451 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" event={"ID":"9f8a5a83-82d6-4af1-9afa-816275ced3a8","Type":"ContainerStarted","Data":"80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8"} Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.788809 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.788857 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.788870 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.788892 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.788907 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:50Z","lastTransitionTime":"2025-12-04T09:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.790318 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.803519 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.814769 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.827441 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.847067 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"message\\\":\\\"pha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 09:39:46.916405 6010 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 
09:39:46.916737 6010 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 09:39:46.917411 6010 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917450 6010 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917519 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.918453 6010 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 09:39:46.918499 6010 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1204 09:39:46.918531 6010 factory.go:656] Stopping watch factory\\\\nI1204 09:39:46.918554 6010 ovnkube.go:599] Stopped ovnkube\\\\nI1204 09:39:46.918590 6010 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1204 09:39:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"vn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 09:39:48.523275 6198 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:39:48.523521 6198 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"hos
t-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.860511 4776 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.872756 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.883785 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.891828 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.891871 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.891882 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:50 crc 
kubenswrapper[4776]: I1204 09:39:50.891899 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.891926 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:50Z","lastTransitionTime":"2025-12-04T09:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.894221 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.909802 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a
43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.923442 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc 
kubenswrapper[4776]: I1204 09:39:50.938902 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40
a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 
09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.951189 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.963989 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.977035 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.986336 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.995133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.995165 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.995174 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.995190 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.995202 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:50Z","lastTransitionTime":"2025-12-04T09:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:50 crc kubenswrapper[4776]: I1204 09:39:50.997952 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:50Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.011484 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.029905 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.044301 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.054975 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.070186 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.082324 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.092937 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e
7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.097480 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.097505 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.097514 4776 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.097532 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.097545 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:51Z","lastTransitionTime":"2025-12-04T09:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.104490 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.118777 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.135385 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.147019 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.160607 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.176117 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.200018 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"message\\\":\\\"pha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 09:39:46.916405 6010 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 
09:39:46.916737 6010 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 09:39:46.917411 6010 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917450 6010 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917519 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.918453 6010 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 09:39:46.918499 6010 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1204 09:39:46.918531 6010 factory.go:656] Stopping watch factory\\\\nI1204 09:39:46.918554 6010 ovnkube.go:599] Stopped ovnkube\\\\nI1204 09:39:46.918590 6010 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1204 09:39:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"vn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 09:39:48.523275 6198 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:39:48.523521 6198 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"hos
t-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.200234 4776 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.200268 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.200278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.200313 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.200331 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:51Z","lastTransitionTime":"2025-12-04T09:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.219145 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:51Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.303452 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.303647 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.303684 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.303721 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.303750 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:51Z","lastTransitionTime":"2025-12-04T09:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.407026 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.407072 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.407086 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.407104 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.407119 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:51Z","lastTransitionTime":"2025-12-04T09:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.425348 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs\") pod \"network-metrics-daemon-g5jzd\" (UID: \"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:51 crc kubenswrapper[4776]: E1204 09:39:51.425599 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:39:51 crc kubenswrapper[4776]: E1204 09:39:51.425742 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs podName:5cca4979-0471-4a2c-97ca-b6ec6fdd935d nodeName:}" failed. No retries permitted until 2025-12-04 09:39:53.425715797 +0000 UTC m=+38.292196334 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs") pod "network-metrics-daemon-g5jzd" (UID: "5cca4979-0471-4a2c-97ca-b6ec6fdd935d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.454526 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:51 crc kubenswrapper[4776]: E1204 09:39:51.454670 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.509712 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.509747 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.509755 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.509771 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.509782 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:51Z","lastTransitionTime":"2025-12-04T09:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.612835 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.612904 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.612967 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.613008 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.613034 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:51Z","lastTransitionTime":"2025-12-04T09:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.716481 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.716557 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.716581 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.716614 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.716634 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:51Z","lastTransitionTime":"2025-12-04T09:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.760187 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/1.log" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.820536 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.820656 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.820685 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.820718 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.820748 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:51Z","lastTransitionTime":"2025-12-04T09:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.924826 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.924889 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.924908 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.924967 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:51 crc kubenswrapper[4776]: I1204 09:39:51.924993 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:51Z","lastTransitionTime":"2025-12-04T09:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.027824 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.027882 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.027900 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.027956 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.027972 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:52Z","lastTransitionTime":"2025-12-04T09:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.131307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.131370 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.131387 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.131413 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.131430 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:52Z","lastTransitionTime":"2025-12-04T09:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.234378 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.234455 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.234477 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.234504 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.234524 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:52Z","lastTransitionTime":"2025-12-04T09:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.337269 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.337349 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.337393 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.337419 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.337442 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:52Z","lastTransitionTime":"2025-12-04T09:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.440412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.440500 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.440526 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.440556 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.440578 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:52Z","lastTransitionTime":"2025-12-04T09:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.451999 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.452055 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.451999 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:52 crc kubenswrapper[4776]: E1204 09:39:52.452233 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:39:52 crc kubenswrapper[4776]: E1204 09:39:52.452380 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:39:52 crc kubenswrapper[4776]: E1204 09:39:52.452523 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.544033 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.544100 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.544118 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.544143 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.544159 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:52Z","lastTransitionTime":"2025-12-04T09:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.647464 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.647558 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.647584 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.647618 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.647652 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:52Z","lastTransitionTime":"2025-12-04T09:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.751156 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.751224 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.751240 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.751264 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.751286 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:52Z","lastTransitionTime":"2025-12-04T09:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.853787 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.853846 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.853856 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.853876 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.853889 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:52Z","lastTransitionTime":"2025-12-04T09:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.958081 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.958532 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.958568 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.958596 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:52 crc kubenswrapper[4776]: I1204 09:39:52.958615 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:52Z","lastTransitionTime":"2025-12-04T09:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.061482 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.061520 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.061533 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.061555 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.061571 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:53Z","lastTransitionTime":"2025-12-04T09:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.165559 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.165627 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.165651 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.165681 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.165705 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:53Z","lastTransitionTime":"2025-12-04T09:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.268216 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.268251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.268260 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.268274 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.268283 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:53Z","lastTransitionTime":"2025-12-04T09:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.371709 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.371786 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.371809 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.371841 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.371865 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:53Z","lastTransitionTime":"2025-12-04T09:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.449430 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs\") pod \"network-metrics-daemon-g5jzd\" (UID: \"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:53 crc kubenswrapper[4776]: E1204 09:39:53.449678 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:39:53 crc kubenswrapper[4776]: E1204 09:39:53.449854 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs podName:5cca4979-0471-4a2c-97ca-b6ec6fdd935d nodeName:}" failed. No retries permitted until 2025-12-04 09:39:57.449811583 +0000 UTC m=+42.316292010 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs") pod "network-metrics-daemon-g5jzd" (UID: "5cca4979-0471-4a2c-97ca-b6ec6fdd935d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.452359 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:53 crc kubenswrapper[4776]: E1204 09:39:53.452533 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.475153 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.475213 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.475231 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.475258 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.475281 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:53Z","lastTransitionTime":"2025-12-04T09:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.578246 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.578330 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.578346 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.578368 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.578383 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:53Z","lastTransitionTime":"2025-12-04T09:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.681908 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.681983 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.681996 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.682021 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.682034 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:53Z","lastTransitionTime":"2025-12-04T09:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.784982 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.785075 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.785090 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.785136 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.785151 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:53Z","lastTransitionTime":"2025-12-04T09:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.904021 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.904061 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.904069 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.904084 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:53 crc kubenswrapper[4776]: I1204 09:39:53.904094 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:53Z","lastTransitionTime":"2025-12-04T09:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.007123 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.007179 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.007192 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.007217 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.007231 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:54Z","lastTransitionTime":"2025-12-04T09:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.109927 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.109974 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.109982 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.110006 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.110017 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:54Z","lastTransitionTime":"2025-12-04T09:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.212261 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.212314 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.212326 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.212344 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.212354 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:54Z","lastTransitionTime":"2025-12-04T09:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.315313 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.315387 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.315405 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.315431 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.315448 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:54Z","lastTransitionTime":"2025-12-04T09:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.419074 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.419134 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.419150 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.419171 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.419190 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:54Z","lastTransitionTime":"2025-12-04T09:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.452099 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.452162 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.452187 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:54 crc kubenswrapper[4776]: E1204 09:39:54.452379 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:39:54 crc kubenswrapper[4776]: E1204 09:39:54.452429 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:39:54 crc kubenswrapper[4776]: E1204 09:39:54.452497 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.522350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.522403 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.522415 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.522436 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.522451 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:54Z","lastTransitionTime":"2025-12-04T09:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.626566 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.626634 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.626684 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.626713 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.626730 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:54Z","lastTransitionTime":"2025-12-04T09:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.729642 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.729707 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.729725 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.729751 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.729770 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:54Z","lastTransitionTime":"2025-12-04T09:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.832314 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.832389 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.832411 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.832441 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.832460 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:54Z","lastTransitionTime":"2025-12-04T09:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.935900 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.936032 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.936051 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.936083 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:54 crc kubenswrapper[4776]: I1204 09:39:54.936102 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:54Z","lastTransitionTime":"2025-12-04T09:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.039045 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.039107 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.039122 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.039149 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.039167 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:55Z","lastTransitionTime":"2025-12-04T09:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.142419 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.142485 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.142503 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.142533 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.142552 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:55Z","lastTransitionTime":"2025-12-04T09:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.246340 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.246408 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.246426 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.246454 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.246472 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:55Z","lastTransitionTime":"2025-12-04T09:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.350049 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.350111 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.350131 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.350156 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.350186 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:55Z","lastTransitionTime":"2025-12-04T09:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.451281 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:55 crc kubenswrapper[4776]: E1204 09:39:55.451529 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.453254 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.453323 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.453345 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.453377 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.453400 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:55Z","lastTransitionTime":"2025-12-04T09:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.473623 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.492987 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.510814 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.532171 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.550683 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.556371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.556409 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.556417 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.556432 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.556446 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:55Z","lastTransitionTime":"2025-12-04T09:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.575141 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://639ca1101a71b0cf2762513c946a08ff81669352633184dc15e7b49d8572c504\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"message\\\":\\\"pha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 09:39:46.916405 6010 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 
09:39:46.916737 6010 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 09:39:46.917411 6010 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917450 6010 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.917519 6010 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 09:39:46.918453 6010 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 09:39:46.918499 6010 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1204 09:39:46.918531 6010 factory.go:656] Stopping watch factory\\\\nI1204 09:39:46.918554 6010 ovnkube.go:599] Stopped ovnkube\\\\nI1204 09:39:46.918590 6010 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1204 09:39:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"vn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 09:39:48.523275 6198 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:39:48.523521 6198 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"hos
t-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.590201 4776 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c7
74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca
01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.602858 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.621740 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.639406 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.651903 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.658403 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.658439 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.658449 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.658464 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.658474 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:55Z","lastTransitionTime":"2025-12-04T09:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.667853 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.677750 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc 
kubenswrapper[4776]: I1204 09:39:55.687177 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.697928 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.708966 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.763569 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.763602 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.763611 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.763625 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.763634 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:55Z","lastTransitionTime":"2025-12-04T09:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.867378 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.867446 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.867464 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.867491 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.867509 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:55Z","lastTransitionTime":"2025-12-04T09:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.882785 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.884030 4776 scope.go:117] "RemoveContainer" containerID="8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383" Dec 04 09:39:55 crc kubenswrapper[4776]: E1204 09:39:55.884404 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.899388 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.917260 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.929283 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.940602 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.959402 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.970934 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.970988 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.971005 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.971030 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.971048 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:55Z","lastTransitionTime":"2025-12-04T09:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.983177 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"vn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] 
Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 09:39:48.523275 6198 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:39:48.523521 6198 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590
acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:55 crc kubenswrapper[4776]: I1204 09:39:55.998151 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de
77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.011191 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:56Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.024984 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:56Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.042617 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:56Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.056337 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:56Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.068981 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:56Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.077173 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.077339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.077454 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.077538 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.077596 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:56Z","lastTransitionTime":"2025-12-04T09:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.083171 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:56Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:56 crc 
kubenswrapper[4776]: I1204 09:39:56.098897 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:56Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.111639 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:56Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.125394 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:56Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.181035 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.181101 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.181122 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.181149 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.181168 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:56Z","lastTransitionTime":"2025-12-04T09:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.284097 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.284143 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.284155 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.284176 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.284192 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:56Z","lastTransitionTime":"2025-12-04T09:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.386247 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.386287 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.386296 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.386313 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.386323 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:56Z","lastTransitionTime":"2025-12-04T09:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.451911 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.451970 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:56 crc kubenswrapper[4776]: E1204 09:39:56.452546 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.452006 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:56 crc kubenswrapper[4776]: E1204 09:39:56.452765 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:39:56 crc kubenswrapper[4776]: E1204 09:39:56.453000 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.489119 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.489201 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.489224 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.489257 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.489284 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:56Z","lastTransitionTime":"2025-12-04T09:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.593085 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.593149 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.593170 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.593199 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.593216 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:56Z","lastTransitionTime":"2025-12-04T09:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.696670 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.696734 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.696753 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.696780 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.696797 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:56Z","lastTransitionTime":"2025-12-04T09:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.799504 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.799839 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.800055 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.800188 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.800311 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:56Z","lastTransitionTime":"2025-12-04T09:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.903548 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.903875 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.903987 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.904093 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:56 crc kubenswrapper[4776]: I1204 09:39:56.904167 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:56Z","lastTransitionTime":"2025-12-04T09:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.007138 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.007183 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.007196 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.007215 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.007228 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.109795 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.109864 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.109874 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.109892 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.109902 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.213174 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.213245 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.213262 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.213287 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.213306 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.316714 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.316771 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.316789 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.316816 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.316834 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.419121 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.419191 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.419209 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.419234 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.419251 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.451876 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:57 crc kubenswrapper[4776]: E1204 09:39:57.452370 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.499335 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs\") pod \"network-metrics-daemon-g5jzd\" (UID: \"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:57 crc kubenswrapper[4776]: E1204 09:39:57.499577 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:39:57 crc kubenswrapper[4776]: E1204 09:39:57.499698 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs podName:5cca4979-0471-4a2c-97ca-b6ec6fdd935d nodeName:}" failed. No retries permitted until 2025-12-04 09:40:05.499665676 +0000 UTC m=+50.366146093 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs") pod "network-metrics-daemon-g5jzd" (UID: "5cca4979-0471-4a2c-97ca-b6ec6fdd935d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.523099 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.523173 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.523192 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.523218 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.523239 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.617505 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.617781 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.617873 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.617983 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.618062 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: E1204 09:39:57.636059 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:57Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.640503 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.640572 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.640588 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.640613 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.640630 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: E1204 09:39:57.657860 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:57Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.664943 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.665013 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.665031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.665069 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.665087 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: E1204 09:39:57.679500 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:57Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.684371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.684433 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.684458 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.684490 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.684513 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: E1204 09:39:57.700211 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:57Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.704136 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.704178 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.704191 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.704210 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.704222 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: E1204 09:39:57.718654 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:39:57Z is after 2025-08-24T17:21:41Z" Dec 04 09:39:57 crc kubenswrapper[4776]: E1204 09:39:57.718816 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.720344 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.720371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.720381 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.720397 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.720407 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.823514 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.823554 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.823565 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.823581 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.823591 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.926755 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.926802 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.926811 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.926829 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:57 crc kubenswrapper[4776]: I1204 09:39:57.926840 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:57Z","lastTransitionTime":"2025-12-04T09:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.029699 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.029771 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.029795 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.029828 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.029857 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:58Z","lastTransitionTime":"2025-12-04T09:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.132714 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.132799 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.132822 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.132850 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.132867 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:58Z","lastTransitionTime":"2025-12-04T09:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.236259 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.236329 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.236351 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.236376 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.236393 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:58Z","lastTransitionTime":"2025-12-04T09:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.340147 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.340199 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.340216 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.340241 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.340260 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:58Z","lastTransitionTime":"2025-12-04T09:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.443467 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.443529 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.443545 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.443593 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.443611 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:58Z","lastTransitionTime":"2025-12-04T09:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.451787 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.451853 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.452185 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:39:58 crc kubenswrapper[4776]: E1204 09:39:58.452390 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:39:58 crc kubenswrapper[4776]: E1204 09:39:58.452646 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:39:58 crc kubenswrapper[4776]: E1204 09:39:58.452817 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.546880 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.546970 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.546995 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.547027 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.547051 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:58Z","lastTransitionTime":"2025-12-04T09:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.650090 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.650176 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.650202 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.650237 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.650261 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:58Z","lastTransitionTime":"2025-12-04T09:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.753997 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.754064 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.754091 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.754124 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.754150 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:58Z","lastTransitionTime":"2025-12-04T09:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.857661 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.857733 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.857757 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.857783 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.857801 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:58Z","lastTransitionTime":"2025-12-04T09:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.961630 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.961699 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.961716 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.961749 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:58 crc kubenswrapper[4776]: I1204 09:39:58.961765 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:58Z","lastTransitionTime":"2025-12-04T09:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.065172 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.065241 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.065258 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.065285 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.065303 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:59Z","lastTransitionTime":"2025-12-04T09:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.168981 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.169058 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.169082 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.169115 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.169138 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:59Z","lastTransitionTime":"2025-12-04T09:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.272315 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.272369 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.272387 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.272414 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.272433 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:59Z","lastTransitionTime":"2025-12-04T09:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.376752 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.376872 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.376892 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.376989 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.377067 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:59Z","lastTransitionTime":"2025-12-04T09:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.452280 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:39:59 crc kubenswrapper[4776]: E1204 09:39:59.452490 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.480504 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.480571 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.480589 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.480610 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.480628 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:59Z","lastTransitionTime":"2025-12-04T09:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.584455 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.584522 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.584540 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.584566 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.584586 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:59Z","lastTransitionTime":"2025-12-04T09:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.687937 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.688005 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.688025 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.688053 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.688070 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:59Z","lastTransitionTime":"2025-12-04T09:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.793727 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.793791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.793814 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.793846 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.793872 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:59Z","lastTransitionTime":"2025-12-04T09:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.897270 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.897333 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.897357 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.897384 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:39:59 crc kubenswrapper[4776]: I1204 09:39:59.897400 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:39:59Z","lastTransitionTime":"2025-12-04T09:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.001241 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.001289 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.001302 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.001321 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.001332 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:00Z","lastTransitionTime":"2025-12-04T09:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.104633 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.104726 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.104761 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.104793 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.104816 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:00Z","lastTransitionTime":"2025-12-04T09:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.207752 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.207809 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.207823 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.207841 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.207857 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:00Z","lastTransitionTime":"2025-12-04T09:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.310553 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.310615 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.310628 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.310649 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.310664 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:00Z","lastTransitionTime":"2025-12-04T09:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.414136 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.414233 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.414260 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.414291 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.414314 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:00Z","lastTransitionTime":"2025-12-04T09:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.451452 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.451485 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.451485 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:00 crc kubenswrapper[4776]: E1204 09:40:00.451678 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:00 crc kubenswrapper[4776]: E1204 09:40:00.451986 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:00 crc kubenswrapper[4776]: E1204 09:40:00.452104 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.517028 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.517095 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.517112 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.517137 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.517151 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:00Z","lastTransitionTime":"2025-12-04T09:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.619901 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.619973 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.619984 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.620003 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.620018 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:00Z","lastTransitionTime":"2025-12-04T09:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.722999 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.723053 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.723063 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.723082 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.723096 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:00Z","lastTransitionTime":"2025-12-04T09:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.826203 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.826274 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.826285 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.826308 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.826321 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:00Z","lastTransitionTime":"2025-12-04T09:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.928597 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.928685 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.928704 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.928724 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:00 crc kubenswrapper[4776]: I1204 09:40:00.928741 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:00Z","lastTransitionTime":"2025-12-04T09:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.032068 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.032338 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.032351 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.032372 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.032414 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:01Z","lastTransitionTime":"2025-12-04T09:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.135555 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.135613 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.135625 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.135646 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.135659 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:01Z","lastTransitionTime":"2025-12-04T09:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.239810 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.239879 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.239892 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.239918 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.239968 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:01Z","lastTransitionTime":"2025-12-04T09:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.343103 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.343163 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.343175 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.343196 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.343207 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:01Z","lastTransitionTime":"2025-12-04T09:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.445863 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.445963 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.445991 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.446026 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.446045 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:01Z","lastTransitionTime":"2025-12-04T09:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.452165 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:01 crc kubenswrapper[4776]: E1204 09:40:01.452297 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.549589 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.549642 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.549651 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.549668 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.549679 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:01Z","lastTransitionTime":"2025-12-04T09:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.652178 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.652239 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.652256 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.652278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.652292 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:01Z","lastTransitionTime":"2025-12-04T09:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.757044 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.757115 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.757133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.757181 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.757200 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:01Z","lastTransitionTime":"2025-12-04T09:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.860567 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.860630 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.860648 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.860674 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.860691 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:01Z","lastTransitionTime":"2025-12-04T09:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.963767 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.963840 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.963916 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.963992 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:01 crc kubenswrapper[4776]: I1204 09:40:01.964017 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:01Z","lastTransitionTime":"2025-12-04T09:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.068356 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.068426 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.068440 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.068459 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.068474 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:02Z","lastTransitionTime":"2025-12-04T09:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.172404 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.172471 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.172488 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.172514 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.172531 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:02Z","lastTransitionTime":"2025-12-04T09:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.275426 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.275483 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.275543 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.275568 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.275628 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:02Z","lastTransitionTime":"2025-12-04T09:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.377028 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.378674 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.378747 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.378764 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.378793 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.378811 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:02Z","lastTransitionTime":"2025-12-04T09:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.392221 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.397090 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.412184 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.430160 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.447769 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.451341 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.451389 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:02 crc kubenswrapper[4776]: E1204 09:40:02.451498 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.451357 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:02 crc kubenswrapper[4776]: E1204 09:40:02.451589 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:02 crc kubenswrapper[4776]: E1204 09:40:02.451653 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.460848 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.475827 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.488771 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.489075 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.489148 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.489228 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.489294 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:02Z","lastTransitionTime":"2025-12-04T09:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.491671 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z 
is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.516047 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"vn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] 
Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 09:39:48.523275 6198 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:39:48.523521 6198 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590
acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.533109 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de
77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.551316 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.566415 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.580651 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.592578 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.592642 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.592662 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.592691 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.592710 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:02Z","lastTransitionTime":"2025-12-04T09:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.595072 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.612384 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.626132 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.645336 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-04T09:40:02Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.695730 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.695821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.695841 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.695886 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.695904 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:02Z","lastTransitionTime":"2025-12-04T09:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.798889 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.798955 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.798967 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.798988 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.799002 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:02Z","lastTransitionTime":"2025-12-04T09:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.901339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.901377 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.901385 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.901401 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:02 crc kubenswrapper[4776]: I1204 09:40:02.901411 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:02Z","lastTransitionTime":"2025-12-04T09:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.003615 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.003660 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.003680 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.003704 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.003718 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:03Z","lastTransitionTime":"2025-12-04T09:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.106529 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.106613 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.106634 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.106661 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.106679 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:03Z","lastTransitionTime":"2025-12-04T09:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.209146 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.209207 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.209223 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.209244 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.209256 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:03Z","lastTransitionTime":"2025-12-04T09:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.312068 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.312137 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.312156 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.312185 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.312203 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:03Z","lastTransitionTime":"2025-12-04T09:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.414519 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.414577 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.414588 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.414605 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.414618 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:03Z","lastTransitionTime":"2025-12-04T09:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.452291 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:03 crc kubenswrapper[4776]: E1204 09:40:03.452495 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.518104 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.518160 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.518172 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.518193 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.518208 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:03Z","lastTransitionTime":"2025-12-04T09:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.621234 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.621292 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.621305 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.621324 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.621336 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:03Z","lastTransitionTime":"2025-12-04T09:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.724798 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.724959 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.724984 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.725089 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.725165 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:03Z","lastTransitionTime":"2025-12-04T09:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.828527 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.828596 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.828617 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.828651 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.828669 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:03Z","lastTransitionTime":"2025-12-04T09:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.930885 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.930963 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.930978 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.930998 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:03 crc kubenswrapper[4776]: I1204 09:40:03.931010 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:03Z","lastTransitionTime":"2025-12-04T09:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.033664 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.033743 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.033764 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.033792 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.033811 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:04Z","lastTransitionTime":"2025-12-04T09:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.138251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.138325 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.138337 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.138359 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.138379 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:04Z","lastTransitionTime":"2025-12-04T09:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.241675 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.241732 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.241751 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.241771 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.241783 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:04Z","lastTransitionTime":"2025-12-04T09:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.344360 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.344402 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.344416 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.344434 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.344445 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:04Z","lastTransitionTime":"2025-12-04T09:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.447539 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.447592 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.447604 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.447621 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.447632 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:04Z","lastTransitionTime":"2025-12-04T09:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.452168 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.452233 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.452233 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:04 crc kubenswrapper[4776]: E1204 09:40:04.452429 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:04 crc kubenswrapper[4776]: E1204 09:40:04.452488 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:04 crc kubenswrapper[4776]: E1204 09:40:04.452273 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.549881 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.549968 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.550012 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.550032 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.550043 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:04Z","lastTransitionTime":"2025-12-04T09:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.656064 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.656140 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.656152 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.656174 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.656205 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:04Z","lastTransitionTime":"2025-12-04T09:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.759955 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.760023 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.760037 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.760057 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.760071 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:04Z","lastTransitionTime":"2025-12-04T09:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.863851 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.863900 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.863909 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.863943 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.863955 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:04Z","lastTransitionTime":"2025-12-04T09:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.967363 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.967420 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.967433 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.967456 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:04 crc kubenswrapper[4776]: I1204 09:40:04.967471 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:04Z","lastTransitionTime":"2025-12-04T09:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.070648 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.070694 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.070704 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.070721 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.070733 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:05Z","lastTransitionTime":"2025-12-04T09:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.173892 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.173963 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.173978 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.173996 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.174007 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:05Z","lastTransitionTime":"2025-12-04T09:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.276699 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.276751 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.276763 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.276781 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.276794 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:05Z","lastTransitionTime":"2025-12-04T09:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.379702 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.379758 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.379773 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.379793 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.379805 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:05Z","lastTransitionTime":"2025-12-04T09:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.451715 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:05 crc kubenswrapper[4776]: E1204 09:40:05.451963 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.468250 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.482763 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.482832 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.482853 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.482882 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.482900 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:05Z","lastTransitionTime":"2025-12-04T09:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.490523 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"vn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] 
Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 09:39:48.523275 6198 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:39:48.523521 6198 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590
acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.505631 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.519976 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.537008 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.547198 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.563255 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d141
53da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.575573 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.588280 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.588323 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.588331 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.588348 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.588359 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:05Z","lastTransitionTime":"2025-12-04T09:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.594254 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs\") pod \"network-metrics-daemon-g5jzd\" (UID: \"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:05 crc kubenswrapper[4776]: E1204 09:40:05.594391 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:40:05 crc kubenswrapper[4776]: E1204 09:40:05.594434 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs podName:5cca4979-0471-4a2c-97ca-b6ec6fdd935d nodeName:}" failed. No retries permitted until 2025-12-04 09:40:21.594418647 +0000 UTC m=+66.460899024 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs") pod "network-metrics-daemon-g5jzd" (UID: "5cca4979-0471-4a2c-97ca-b6ec6fdd935d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.595211 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay
.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee80
7c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.613328 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.625784 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.643799 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.656962 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.670785 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.687145 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed0e05d-0d8d-4722-9f1c-d391bb980e4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124529cb7ef6f25b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.691370 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.691419 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.691428 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.691448 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.691461 4776 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:05Z","lastTransitionTime":"2025-12-04T09:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.702858 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.714193 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:05Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.794096 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.794168 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.794179 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.794201 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.794213 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:05Z","lastTransitionTime":"2025-12-04T09:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.897168 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.897261 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.897295 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.897331 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:05 crc kubenswrapper[4776]: I1204 09:40:05.897360 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:05Z","lastTransitionTime":"2025-12-04T09:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.000653 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.000709 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.000722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.000743 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.000760 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:06Z","lastTransitionTime":"2025-12-04T09:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.105269 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.105363 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.105381 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.105444 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.105476 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:06Z","lastTransitionTime":"2025-12-04T09:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.208515 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.208588 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.208606 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.208629 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.208644 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:06Z","lastTransitionTime":"2025-12-04T09:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.310857 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.310953 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.310973 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.311003 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.311020 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:06Z","lastTransitionTime":"2025-12-04T09:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.402207 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.402546 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.402612 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.402647 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.402740 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:40:06 crc kubenswrapper[4776]: 
E1204 09:40:06.402741 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:40:38.402695123 +0000 UTC m=+83.269175540 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.402806 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.402876 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.402889 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.402970 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.403012 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.402895 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.402896 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:40:38.402845398 +0000 UTC m=+83.269325815 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.403280 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.403324 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.403546 4776 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 09:40:38.403530199 +0000 UTC m=+83.270010646 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.403568 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:40:38.40355903 +0000 UTC m=+83.270039517 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.403582 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:40:38.40357441 +0000 UTC m=+83.270054787 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.413621 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.413674 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.413692 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.413717 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.413731 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:06Z","lastTransitionTime":"2025-12-04T09:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.452280 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.452460 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.452653 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.452726 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.453009 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:06 crc kubenswrapper[4776]: E1204 09:40:06.453134 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.518025 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.518095 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.518112 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.518138 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.518156 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:06Z","lastTransitionTime":"2025-12-04T09:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.621839 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.621877 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.621887 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.621905 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.621920 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:06Z","lastTransitionTime":"2025-12-04T09:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.729443 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.729801 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.729813 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.729865 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.729879 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:06Z","lastTransitionTime":"2025-12-04T09:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.833337 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.833386 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.833401 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.833425 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.833439 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:06Z","lastTransitionTime":"2025-12-04T09:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.935857 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.935908 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.935942 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.935963 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:06 crc kubenswrapper[4776]: I1204 09:40:06.935976 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:06Z","lastTransitionTime":"2025-12-04T09:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.039151 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.039205 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.039217 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.039236 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.039247 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:07Z","lastTransitionTime":"2025-12-04T09:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.142212 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.142265 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.142281 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.142302 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.142314 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:07Z","lastTransitionTime":"2025-12-04T09:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.245475 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.245527 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.245541 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.245562 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.245572 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:07Z","lastTransitionTime":"2025-12-04T09:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.348047 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.348097 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.348107 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.348125 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.348135 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:07Z","lastTransitionTime":"2025-12-04T09:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.451482 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.451492 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.451584 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.451631 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.451660 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.451710 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:07Z","lastTransitionTime":"2025-12-04T09:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:07 crc kubenswrapper[4776]: E1204 09:40:07.451646 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.555248 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.555347 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.555362 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.555385 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.555399 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:07Z","lastTransitionTime":"2025-12-04T09:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.658740 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.658795 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.658812 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.658836 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.658853 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:07Z","lastTransitionTime":"2025-12-04T09:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.761796 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.761840 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.761853 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.761873 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.761887 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:07Z","lastTransitionTime":"2025-12-04T09:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.865240 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.865291 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.865301 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.865315 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.865324 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:07Z","lastTransitionTime":"2025-12-04T09:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.969105 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.969208 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.969232 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.969259 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.969279 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:07Z","lastTransitionTime":"2025-12-04T09:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.978989 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.979045 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.979058 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.979077 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.979091 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:07Z","lastTransitionTime":"2025-12-04T09:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:07 crc kubenswrapper[4776]: E1204 09:40:07.994228 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:07Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.999196 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.999245 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.999257 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.999276 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:07 crc kubenswrapper[4776]: I1204 09:40:07.999288 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:07Z","lastTransitionTime":"2025-12-04T09:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:08 crc kubenswrapper[4776]: E1204 09:40:08.014726 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:08Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.021461 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.021503 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.021514 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.021533 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.021543 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:08Z","lastTransitionTime":"2025-12-04T09:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:08 crc kubenswrapper[4776]: E1204 09:40:08.042685 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:08Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.047149 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.047202 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.047211 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.047228 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.047238 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:08Z","lastTransitionTime":"2025-12-04T09:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:08 crc kubenswrapper[4776]: E1204 09:40:08.059724 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:08Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.064264 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.064311 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.064321 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.064343 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.064357 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:08Z","lastTransitionTime":"2025-12-04T09:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:08 crc kubenswrapper[4776]: E1204 09:40:08.077674 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:08Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:08 crc kubenswrapper[4776]: E1204 09:40:08.077844 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.079798 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.079855 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.079869 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.079887 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.079973 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:08Z","lastTransitionTime":"2025-12-04T09:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.182496 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.182552 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.182574 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.182597 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.182610 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:08Z","lastTransitionTime":"2025-12-04T09:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.285352 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.285400 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.285411 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.285430 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.285445 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:08Z","lastTransitionTime":"2025-12-04T09:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.388631 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.388688 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.388701 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.388722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.388733 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:08Z","lastTransitionTime":"2025-12-04T09:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.451785 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.451791 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.451810 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:08 crc kubenswrapper[4776]: E1204 09:40:08.451968 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:08 crc kubenswrapper[4776]: E1204 09:40:08.452231 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:08 crc kubenswrapper[4776]: E1204 09:40:08.452289 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.491799 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.491856 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.491870 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.491889 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.491901 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:08Z","lastTransitionTime":"2025-12-04T09:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.595528 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.595577 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.595588 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.595607 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.595616 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:08Z","lastTransitionTime":"2025-12-04T09:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.702636 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.702689 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.702703 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.702723 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.702739 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:08Z","lastTransitionTime":"2025-12-04T09:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.805769 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.805817 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.805829 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.805845 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.805855 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:08Z","lastTransitionTime":"2025-12-04T09:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.909070 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.909113 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.909122 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.909137 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:08 crc kubenswrapper[4776]: I1204 09:40:08.909148 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:08Z","lastTransitionTime":"2025-12-04T09:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.012746 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.012802 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.012813 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.012828 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.012839 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:09Z","lastTransitionTime":"2025-12-04T09:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.117271 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.117310 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.117320 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.117342 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.117353 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:09Z","lastTransitionTime":"2025-12-04T09:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.219996 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.220075 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.220088 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.220131 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.220147 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:09Z","lastTransitionTime":"2025-12-04T09:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.322512 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.322787 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.322829 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.322854 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.322870 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:09Z","lastTransitionTime":"2025-12-04T09:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.425427 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.425535 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.425557 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.425588 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.425607 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:09Z","lastTransitionTime":"2025-12-04T09:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.452383 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:09 crc kubenswrapper[4776]: E1204 09:40:09.452557 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.528029 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.528097 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.528116 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.528142 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.528163 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:09Z","lastTransitionTime":"2025-12-04T09:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.631032 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.631095 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.631108 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.631131 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.631144 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:09Z","lastTransitionTime":"2025-12-04T09:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.734506 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.734615 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.734641 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.734670 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.734694 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:09Z","lastTransitionTime":"2025-12-04T09:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.837304 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.837382 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.837412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.837446 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.837472 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:09Z","lastTransitionTime":"2025-12-04T09:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.941251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.941305 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.941317 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.941337 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:09 crc kubenswrapper[4776]: I1204 09:40:09.941350 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:09Z","lastTransitionTime":"2025-12-04T09:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.044855 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.044974 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.044993 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.045018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.045040 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:10Z","lastTransitionTime":"2025-12-04T09:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.147790 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.147833 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.147844 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.147863 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.147874 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:10Z","lastTransitionTime":"2025-12-04T09:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.250370 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.250431 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.250449 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.250475 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.250492 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:10Z","lastTransitionTime":"2025-12-04T09:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.353344 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.353712 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.353846 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.354021 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.354167 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:10Z","lastTransitionTime":"2025-12-04T09:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.451783 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.451955 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:10 crc kubenswrapper[4776]: E1204 09:40:10.452029 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.452115 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:10 crc kubenswrapper[4776]: E1204 09:40:10.452346 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:10 crc kubenswrapper[4776]: E1204 09:40:10.452445 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.456501 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.456616 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.456755 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.456829 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.456894 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:10Z","lastTransitionTime":"2025-12-04T09:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.559784 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.559832 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.559843 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.559863 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.559876 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:10Z","lastTransitionTime":"2025-12-04T09:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.662584 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.663213 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.663304 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.663379 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.663453 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:10Z","lastTransitionTime":"2025-12-04T09:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.767095 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.767159 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.767183 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.767213 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.767233 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:10Z","lastTransitionTime":"2025-12-04T09:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.872847 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.872896 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.872908 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.872951 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.872965 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:10Z","lastTransitionTime":"2025-12-04T09:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.975722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.975777 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.975791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.975810 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:10 crc kubenswrapper[4776]: I1204 09:40:10.975822 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:10Z","lastTransitionTime":"2025-12-04T09:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.078127 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.078170 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.078180 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.078196 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.078208 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:11Z","lastTransitionTime":"2025-12-04T09:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.181590 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.181634 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.181643 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.181662 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.181673 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:11Z","lastTransitionTime":"2025-12-04T09:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.284468 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.284512 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.284523 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.284541 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.284552 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:11Z","lastTransitionTime":"2025-12-04T09:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.386954 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.387006 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.387019 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.387036 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.387049 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:11Z","lastTransitionTime":"2025-12-04T09:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.451425 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:11 crc kubenswrapper[4776]: E1204 09:40:11.451583 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.452469 4776 scope.go:117] "RemoveContainer" containerID="8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.489242 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.489319 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.489333 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.489352 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.489365 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:11Z","lastTransitionTime":"2025-12-04T09:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.592076 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.592127 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.592144 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.592166 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.592183 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:11Z","lastTransitionTime":"2025-12-04T09:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.695408 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.695469 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.695483 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.695506 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.695519 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:11Z","lastTransitionTime":"2025-12-04T09:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.798171 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.798200 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.798208 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.798222 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.798232 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:11Z","lastTransitionTime":"2025-12-04T09:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.839868 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/1.log" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.842417 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerStarted","Data":"635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4"} Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.842891 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.857658 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed146301
40f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:11Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.870482 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:11Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.882776 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:11Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.900581 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"vn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] 
Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 09:39:48.523275 6198 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:39:48.523521 6198 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:40:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:11Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.901201 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.901235 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.901245 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.901264 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.901276 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:11Z","lastTransitionTime":"2025-12-04T09:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.913084 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:11Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.924749 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:11Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.940281 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:11Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.950618 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:11Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.966025 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:11Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.979838 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:11Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:11 crc kubenswrapper[4776]: I1204 09:40:11.997809 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:11Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.003403 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.003443 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.003454 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.003474 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 
04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.003485 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:12Z","lastTransitionTime":"2025-12-04T09:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.017051 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.031733 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.044756 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e
7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.055067 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed0e05d-0d8d-4722-9f1c-d391bb980e4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124529cb7ef6f25b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.066559 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.077019 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.105645 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.105710 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.105719 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.105735 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.105744 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:12Z","lastTransitionTime":"2025-12-04T09:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.208371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.208434 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.208447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.208467 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.208481 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:12Z","lastTransitionTime":"2025-12-04T09:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.311208 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.311282 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.311297 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.311315 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.311326 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:12Z","lastTransitionTime":"2025-12-04T09:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.413427 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.413488 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.413499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.413521 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.413535 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:12Z","lastTransitionTime":"2025-12-04T09:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.451985 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.452150 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:12 crc kubenswrapper[4776]: E1204 09:40:12.452332 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.452548 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:12 crc kubenswrapper[4776]: E1204 09:40:12.452617 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:12 crc kubenswrapper[4776]: E1204 09:40:12.452968 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.515619 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.515668 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.515677 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.515693 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.515703 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:12Z","lastTransitionTime":"2025-12-04T09:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.618203 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.618251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.618264 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.618283 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.618297 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:12Z","lastTransitionTime":"2025-12-04T09:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.721424 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.721503 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.721524 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.721553 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.721572 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:12Z","lastTransitionTime":"2025-12-04T09:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.830639 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.831241 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.831331 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.831436 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.831527 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:12Z","lastTransitionTime":"2025-12-04T09:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.848509 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/2.log" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.849575 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/1.log" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.854882 4776 generic.go:334] "Generic (PLEG): container finished" podID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerID="635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4" exitCode=1 Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.855092 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4"} Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.855169 4776 scope.go:117] "RemoveContainer" containerID="8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.858362 4776 scope.go:117] "RemoveContainer" containerID="635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4" Dec 04 09:40:12 crc kubenswrapper[4776]: E1204 09:40:12.858721 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.880639 4776 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a1786
4a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.898955 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.914221 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.930457 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.934787 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.934855 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.934874 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.934895 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.934908 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:12Z","lastTransitionTime":"2025-12-04T09:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.944999 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.963030 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.976375 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.986985 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:12 crc kubenswrapper[4776]: I1204 09:40:12.998989 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed0e05d-0d8d-4722-9f1c-d391bb980e4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124529cb7ef6f25
b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:12Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.012695 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.021568 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.034880 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.037707 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.037754 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.037770 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.037790 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.037800 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:13Z","lastTransitionTime":"2025-12-04T09:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.050242 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.065426 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.098254 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.124058 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.140371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.140410 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.140419 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.140436 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.140448 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:13Z","lastTransitionTime":"2025-12-04T09:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.146035 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d2db12c958f18fc4c73828ddd0c28b1c5d016fa437b50fb832434ffcc09f383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"vn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] 
Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 09:39:48.523275 6198 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:39:48.523521 6198 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:12Z\\\",\\\"message\\\":\\\":\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1204 09:40:12.227245 6451 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:40:12.227310 6451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:40:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acf
baa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.243371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.243616 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.243624 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.243642 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.243650 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:13Z","lastTransitionTime":"2025-12-04T09:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.347260 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.347323 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.347336 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.347358 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.347375 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:13Z","lastTransitionTime":"2025-12-04T09:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.450639 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.450685 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.450694 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.450711 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.450720 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:13Z","lastTransitionTime":"2025-12-04T09:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.451403 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:13 crc kubenswrapper[4776]: E1204 09:40:13.451522 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.553721 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.553771 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.553785 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.553809 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.553824 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:13Z","lastTransitionTime":"2025-12-04T09:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.656440 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.656485 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.656497 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.656514 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.656525 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:13Z","lastTransitionTime":"2025-12-04T09:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.759657 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.759702 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.759722 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.759740 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.759750 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:13Z","lastTransitionTime":"2025-12-04T09:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.861278 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/2.log" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.861355 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.861410 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.861428 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.861457 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.861476 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:13Z","lastTransitionTime":"2025-12-04T09:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.865257 4776 scope.go:117] "RemoveContainer" containerID="635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4" Dec 04 09:40:13 crc kubenswrapper[4776]: E1204 09:40:13.865529 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.886507 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime
\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.903629 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed0e05d-0d8d-4722-9f1c-d391bb980e4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124529cb7ef6f25b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.915803 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.924114 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.942125 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.954375 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.966904 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.966966 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:13 crc 
kubenswrapper[4776]: I1204 09:40:13.967127 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.967139 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.967160 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.967173 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:13Z","lastTransitionTime":"2025-12-04T09:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:13 crc kubenswrapper[4776]: I1204 09:40:13.987319 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:12Z\\\",\\\"message\\\":\\\":\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1204 09:40:12.227245 6451 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:40:12.227310 6451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:40:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590
acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:13Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.003666 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:14Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.017196 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:14Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.031036 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:14Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.042186 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:14Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.058581 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:14Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.069757 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.069807 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.069820 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.069843 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.069856 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:14Z","lastTransitionTime":"2025-12-04T09:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.069878 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:14Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:14 crc 
kubenswrapper[4776]: I1204 09:40:14.083842 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40
a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 
09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:14Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.098743 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:14Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.113363 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:14Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.173218 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.173275 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.173286 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.173306 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.173319 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:14Z","lastTransitionTime":"2025-12-04T09:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.276835 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.276887 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.276898 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.276933 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.276946 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:14Z","lastTransitionTime":"2025-12-04T09:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.380063 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.380110 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.380125 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.380144 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.380158 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:14Z","lastTransitionTime":"2025-12-04T09:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.451292 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.451417 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.451416 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:14 crc kubenswrapper[4776]: E1204 09:40:14.451888 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:14 crc kubenswrapper[4776]: E1204 09:40:14.451677 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:14 crc kubenswrapper[4776]: E1204 09:40:14.451995 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.482824 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.482868 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.482878 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.482892 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.482901 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:14Z","lastTransitionTime":"2025-12-04T09:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.585514 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.586413 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.586576 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.586787 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.586939 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:14Z","lastTransitionTime":"2025-12-04T09:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.689197 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.689242 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.689251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.689270 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.689280 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:14Z","lastTransitionTime":"2025-12-04T09:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.791396 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.791451 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.791462 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.791482 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.791494 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:14Z","lastTransitionTime":"2025-12-04T09:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.893546 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.893595 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.893608 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.893642 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.893653 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:14Z","lastTransitionTime":"2025-12-04T09:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.997679 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.997729 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.997741 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.997760 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:14 crc kubenswrapper[4776]: I1204 09:40:14.997773 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:14Z","lastTransitionTime":"2025-12-04T09:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.100956 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.101031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.101042 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.101060 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.101072 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:15Z","lastTransitionTime":"2025-12-04T09:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.203975 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.204033 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.204050 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.204071 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.204084 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:15Z","lastTransitionTime":"2025-12-04T09:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.306998 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.307042 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.307053 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.307073 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.307085 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:15Z","lastTransitionTime":"2025-12-04T09:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.409246 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.409313 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.409328 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.409356 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.409373 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:15Z","lastTransitionTime":"2025-12-04T09:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.452044 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:15 crc kubenswrapper[4776]: E1204 09:40:15.452189 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.474462 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.487960 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptable
s-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.504963 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.512525 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.512571 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.512590 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.512611 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.512626 4776 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:15Z","lastTransitionTime":"2025-12-04T09:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.518302 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc 
kubenswrapper[4776]: I1204 09:40:15.534426 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40
a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 
09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.548333 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.563138 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.574894 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e
7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.586651 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed0e05d-0d8d-4722-9f1c-d391bb980e4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124529cb7ef6f25b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.599825 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.609882 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.614477 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.614517 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.614527 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.614542 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.614551 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:15Z","lastTransitionTime":"2025-12-04T09:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.624530 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.637071 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.650379 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.669766 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:12Z\\\",\\\"message\\\":\\\":\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1204 09:40:12.227245 6451 
model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:40:12.227310 6451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:40:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590
acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.684975 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.696523 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:15Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.716570 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.716729 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.716863 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.717032 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.717150 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:15Z","lastTransitionTime":"2025-12-04T09:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.823813 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.823864 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.823874 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.823893 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.823904 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:15Z","lastTransitionTime":"2025-12-04T09:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.927545 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.927607 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.927622 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.927640 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:15 crc kubenswrapper[4776]: I1204 09:40:15.927651 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:15Z","lastTransitionTime":"2025-12-04T09:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.030476 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.030513 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.030520 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.030535 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.030544 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:16Z","lastTransitionTime":"2025-12-04T09:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.132935 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.132989 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.133000 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.133016 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.133025 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:16Z","lastTransitionTime":"2025-12-04T09:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.235777 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.236486 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.236500 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.236518 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.236527 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:16Z","lastTransitionTime":"2025-12-04T09:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.339641 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.339688 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.339697 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.339716 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.339727 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:16Z","lastTransitionTime":"2025-12-04T09:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.442422 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.442486 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.442497 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.442515 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.442526 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:16Z","lastTransitionTime":"2025-12-04T09:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.451890 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.451964 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.452010 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:16 crc kubenswrapper[4776]: E1204 09:40:16.452090 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:16 crc kubenswrapper[4776]: E1204 09:40:16.452237 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:16 crc kubenswrapper[4776]: E1204 09:40:16.452310 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.546016 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.546093 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.546104 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.546125 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.546142 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:16Z","lastTransitionTime":"2025-12-04T09:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.648747 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.648780 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.648793 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.648810 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.648821 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:16Z","lastTransitionTime":"2025-12-04T09:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.750810 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.750845 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.750853 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.750866 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.750875 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:16Z","lastTransitionTime":"2025-12-04T09:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.854154 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.854579 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.854600 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.854620 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.854631 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:16Z","lastTransitionTime":"2025-12-04T09:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.957811 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.957852 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.957864 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.957881 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:16 crc kubenswrapper[4776]: I1204 09:40:16.957891 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:16Z","lastTransitionTime":"2025-12-04T09:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.060571 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.060643 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.060657 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.060676 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.060686 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:17Z","lastTransitionTime":"2025-12-04T09:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.170164 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.170213 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.170225 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.170369 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.170386 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:17Z","lastTransitionTime":"2025-12-04T09:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.272708 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.272745 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.272756 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.272772 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.272784 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:17Z","lastTransitionTime":"2025-12-04T09:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.375893 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.376066 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.376093 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.376304 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.376347 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:17Z","lastTransitionTime":"2025-12-04T09:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.451782 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:17 crc kubenswrapper[4776]: E1204 09:40:17.451947 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.478743 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.478775 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.478784 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.478799 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.478808 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:17Z","lastTransitionTime":"2025-12-04T09:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.582101 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.582149 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.582162 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.582182 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.582194 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:17Z","lastTransitionTime":"2025-12-04T09:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.684752 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.684789 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.684797 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.684816 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.684825 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:17Z","lastTransitionTime":"2025-12-04T09:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.787404 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.787447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.787455 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.787470 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.787479 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:17Z","lastTransitionTime":"2025-12-04T09:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.889779 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.889815 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.889825 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.889842 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.889852 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:17Z","lastTransitionTime":"2025-12-04T09:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.992599 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.992648 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.992666 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.992702 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:17 crc kubenswrapper[4776]: I1204 09:40:17.992754 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:17Z","lastTransitionTime":"2025-12-04T09:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.095198 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.095232 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.095243 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.095261 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.095273 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:18Z","lastTransitionTime":"2025-12-04T09:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.198653 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.198718 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.198730 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.198750 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.198763 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:18Z","lastTransitionTime":"2025-12-04T09:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.301835 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.301894 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.301905 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.301951 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.301969 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:18Z","lastTransitionTime":"2025-12-04T09:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.381170 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.381520 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.381631 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.381778 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.382072 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:18Z","lastTransitionTime":"2025-12-04T09:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: E1204 09:40:18.396682 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:18Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.402255 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.402331 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.402350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.402384 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.402403 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:18Z","lastTransitionTime":"2025-12-04T09:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: E1204 09:40:18.417016 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:18Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.422254 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.422319 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.422332 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.422353 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.422366 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:18Z","lastTransitionTime":"2025-12-04T09:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: E1204 09:40:18.439195 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:18Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.445982 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.446141 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.446230 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.446324 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.446415 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:18Z","lastTransitionTime":"2025-12-04T09:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.451926 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.451996 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.452071 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:18 crc kubenswrapper[4776]: E1204 09:40:18.452212 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:18 crc kubenswrapper[4776]: E1204 09:40:18.452439 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:18 crc kubenswrapper[4776]: E1204 09:40:18.452541 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:18 crc kubenswrapper[4776]: E1204 09:40:18.463224 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:18Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.467508 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.467656 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.467744 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.467870 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.468226 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:18Z","lastTransitionTime":"2025-12-04T09:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: E1204 09:40:18.481693 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:18Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:18 crc kubenswrapper[4776]: E1204 09:40:18.481830 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.483706 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.483761 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.483782 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.483804 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.483821 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:18Z","lastTransitionTime":"2025-12-04T09:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.586996 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.587303 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.587398 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.587498 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.587573 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:18Z","lastTransitionTime":"2025-12-04T09:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.690340 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.690384 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.690395 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.690413 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.690425 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:18Z","lastTransitionTime":"2025-12-04T09:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.793316 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.793364 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.793385 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.793408 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.793422 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:18Z","lastTransitionTime":"2025-12-04T09:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.896148 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.896197 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.896207 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.896227 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.896236 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:18Z","lastTransitionTime":"2025-12-04T09:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.999630 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:18 crc kubenswrapper[4776]: I1204 09:40:18.999953 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.000113 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.000261 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.000385 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:19Z","lastTransitionTime":"2025-12-04T09:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.103626 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.103696 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.103713 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.103738 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.103755 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:19Z","lastTransitionTime":"2025-12-04T09:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.205947 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.205980 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.205988 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.206004 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.206013 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:19Z","lastTransitionTime":"2025-12-04T09:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.308573 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.308853 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.308999 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.309084 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.309147 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:19Z","lastTransitionTime":"2025-12-04T09:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.412124 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.412190 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.412202 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.412227 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.412239 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:19Z","lastTransitionTime":"2025-12-04T09:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.451707 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:19 crc kubenswrapper[4776]: E1204 09:40:19.452314 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.515292 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.515356 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.515372 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.515396 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.515419 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:19Z","lastTransitionTime":"2025-12-04T09:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.619691 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.619724 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.619732 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.619747 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.619758 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:19Z","lastTransitionTime":"2025-12-04T09:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.723102 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.723162 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.723172 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.723193 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.723204 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:19Z","lastTransitionTime":"2025-12-04T09:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.825457 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.825503 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.825515 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.825532 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.825544 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:19Z","lastTransitionTime":"2025-12-04T09:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.928228 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.928283 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.928292 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.928312 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:19 crc kubenswrapper[4776]: I1204 09:40:19.928326 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:19Z","lastTransitionTime":"2025-12-04T09:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.031040 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.031090 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.031102 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.031121 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.031131 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:20Z","lastTransitionTime":"2025-12-04T09:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.135065 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.135121 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.135130 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.135150 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.135198 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:20Z","lastTransitionTime":"2025-12-04T09:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.238906 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.238978 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.238991 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.239018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.239033 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:20Z","lastTransitionTime":"2025-12-04T09:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.341454 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.341491 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.341503 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.341521 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.341534 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:20Z","lastTransitionTime":"2025-12-04T09:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.444272 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.444329 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.444342 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.444360 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.444377 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:20Z","lastTransitionTime":"2025-12-04T09:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.451636 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.451700 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.451636 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:20 crc kubenswrapper[4776]: E1204 09:40:20.451789 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:20 crc kubenswrapper[4776]: E1204 09:40:20.451874 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:20 crc kubenswrapper[4776]: E1204 09:40:20.452053 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.547394 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.547446 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.547459 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.547483 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.547496 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:20Z","lastTransitionTime":"2025-12-04T09:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.650278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.650326 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.650338 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.650354 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.650364 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:20Z","lastTransitionTime":"2025-12-04T09:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.753668 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.753715 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.753725 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.753743 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.753755 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:20Z","lastTransitionTime":"2025-12-04T09:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.857205 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.857244 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.857253 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.857268 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.857278 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:20Z","lastTransitionTime":"2025-12-04T09:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.960282 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.960322 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.960334 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.960353 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:20 crc kubenswrapper[4776]: I1204 09:40:20.960366 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:20Z","lastTransitionTime":"2025-12-04T09:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.063536 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.063585 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.063597 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.063619 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.063632 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:21Z","lastTransitionTime":"2025-12-04T09:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.166747 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.166808 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.166821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.166842 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.166857 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:21Z","lastTransitionTime":"2025-12-04T09:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.269277 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.269328 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.269339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.269358 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.269369 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:21Z","lastTransitionTime":"2025-12-04T09:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.372934 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.372977 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.372986 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.373003 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.373017 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:21Z","lastTransitionTime":"2025-12-04T09:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.451693 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:21 crc kubenswrapper[4776]: E1204 09:40:21.451881 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.476155 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.476199 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.476210 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.476227 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.476237 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:21Z","lastTransitionTime":"2025-12-04T09:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.579355 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.579397 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.579406 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.579424 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.579439 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:21Z","lastTransitionTime":"2025-12-04T09:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.676241 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs\") pod \"network-metrics-daemon-g5jzd\" (UID: \"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:21 crc kubenswrapper[4776]: E1204 09:40:21.676463 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:40:21 crc kubenswrapper[4776]: E1204 09:40:21.676545 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs podName:5cca4979-0471-4a2c-97ca-b6ec6fdd935d nodeName:}" failed. No retries permitted until 2025-12-04 09:40:53.676527454 +0000 UTC m=+98.543007831 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs") pod "network-metrics-daemon-g5jzd" (UID: "5cca4979-0471-4a2c-97ca-b6ec6fdd935d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.682275 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.682341 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.682355 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.682378 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.682391 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:21Z","lastTransitionTime":"2025-12-04T09:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.784872 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.784939 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.784953 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.784971 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.784983 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:21Z","lastTransitionTime":"2025-12-04T09:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.887519 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.887606 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.887631 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.887663 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.887682 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:21Z","lastTransitionTime":"2025-12-04T09:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.990232 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.990306 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.990325 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.990351 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:21 crc kubenswrapper[4776]: I1204 09:40:21.990379 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:21Z","lastTransitionTime":"2025-12-04T09:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.093315 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.093383 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.093402 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.093430 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.093447 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:22Z","lastTransitionTime":"2025-12-04T09:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.196648 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.196690 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.196705 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.196726 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.196739 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:22Z","lastTransitionTime":"2025-12-04T09:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.299474 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.299534 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.299546 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.299565 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.299579 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:22Z","lastTransitionTime":"2025-12-04T09:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.402520 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.402586 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.402599 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.402624 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.402637 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:22Z","lastTransitionTime":"2025-12-04T09:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.452595 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.452718 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.452636 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:22 crc kubenswrapper[4776]: E1204 09:40:22.452859 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:22 crc kubenswrapper[4776]: E1204 09:40:22.453010 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:22 crc kubenswrapper[4776]: E1204 09:40:22.453144 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.505658 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.505733 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.505755 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.505784 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.505806 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:22Z","lastTransitionTime":"2025-12-04T09:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.608706 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.608764 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.608786 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.608808 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.608821 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:22Z","lastTransitionTime":"2025-12-04T09:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.713571 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.713638 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.713650 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.713674 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.713687 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:22Z","lastTransitionTime":"2025-12-04T09:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.816558 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.816615 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.816628 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.816651 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.816666 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:22Z","lastTransitionTime":"2025-12-04T09:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.892270 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7xv6z_423f8d5c-40c6-4efe-935f-7a9373d6becd/kube-multus/0.log" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.892324 4776 generic.go:334] "Generic (PLEG): container finished" podID="423f8d5c-40c6-4efe-935f-7a9373d6becd" containerID="4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653" exitCode=1 Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.892362 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7xv6z" event={"ID":"423f8d5c-40c6-4efe-935f-7a9373d6becd","Type":"ContainerDied","Data":"4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653"} Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.892787 4776 scope.go:117] "RemoveContainer" containerID="4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.905316 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed0e05d-0d8d-4722-9f1c-d391bb980e4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124529cb7ef6f25b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:22Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.919213 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:22Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.921361 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.921404 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.921415 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 
09:40:22.921440 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.921451 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:22Z","lastTransitionTime":"2025-12-04T09:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.931958 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:22Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.948681 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:22Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.967091 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:22Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.981784 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:22Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:22 crc kubenswrapper[4776]: I1204 09:40:22.995941 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:22Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.011247 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:22Z\\\",\\\"message\\\":\\\"2025-12-04T09:39:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab\\\\n2025-12-04T09:39:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab to /host/opt/cni/bin/\\\\n2025-12-04T09:39:37Z [verbose] multus-daemon started\\\\n2025-12-04T09:39:37Z [verbose] Readiness Indicator file check\\\\n2025-12-04T09:40:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.024448 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.024495 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.024506 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.024536 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.024551 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:23Z","lastTransitionTime":"2025-12-04T09:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.031369 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:12Z\\\",\\\"message\\\":\\\":\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1204 09:40:12.227245 6451 
model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:40:12.227310 6451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:40:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590
acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.048313 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de
77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.066805 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.092499 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.107263 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.120299 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.126939 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.126999 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.127015 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.127036 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.127050 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:23Z","lastTransitionTime":"2025-12-04T09:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.134847 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.146176 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc 
kubenswrapper[4776]: I1204 09:40:23.158023 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.229316 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.229347 4776 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.229356 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.229371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.229382 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:23Z","lastTransitionTime":"2025-12-04T09:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.331992 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.332021 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.332031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.332047 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.332061 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:23Z","lastTransitionTime":"2025-12-04T09:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.434598 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.434637 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.434648 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.434664 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.434674 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:23Z","lastTransitionTime":"2025-12-04T09:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.456033 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:23 crc kubenswrapper[4776]: E1204 09:40:23.456204 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.537688 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.537729 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.537746 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.537764 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.537779 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:23Z","lastTransitionTime":"2025-12-04T09:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.640474 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.640531 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.640540 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.640559 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.640570 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:23Z","lastTransitionTime":"2025-12-04T09:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.743382 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.743434 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.743457 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.743478 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.743489 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:23Z","lastTransitionTime":"2025-12-04T09:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.847152 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.847205 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.847218 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.847651 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.847691 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:23Z","lastTransitionTime":"2025-12-04T09:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.898694 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7xv6z_423f8d5c-40c6-4efe-935f-7a9373d6becd/kube-multus/0.log" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.898759 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7xv6z" event={"ID":"423f8d5c-40c6-4efe-935f-7a9373d6becd","Type":"ContainerStarted","Data":"0647b2bd40f4d09007d33837bceaec60188e87d77bae9c0f49a1d3c1a91d909c"} Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.915697 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc 
kubenswrapper[4776]: I1204 09:40:23.930086 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40
a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 
09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.943365 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.950285 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.950314 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.950323 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.950340 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.950353 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:23Z","lastTransitionTime":"2025-12-04T09:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.957729 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e
4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.971453 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.982347 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:23 crc kubenswrapper[4776]: I1204 09:40:23.998473 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:23Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.011269 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e
7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:24Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.024141 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed0e05d-0d8d-4722-9f1c-d391bb980e4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124529cb7ef6f25b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:24Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.038805 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:24Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.048991 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:24Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.053348 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.053398 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.053413 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.053435 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.053449 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:24Z","lastTransitionTime":"2025-12-04T09:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.071416 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:12Z\\\",\\\"message\\\":\\\":\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1204 09:40:12.227245 6451 
model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:40:12.227310 6451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:40:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590
acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:24Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.085848 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:24Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.103885 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:24Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.118424 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:24Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.132183 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:24Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.144069 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0647b2bd40f4d09007d33837bceaec60188e87d77bae9c0f49a1d3c1a91d909c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:22Z\\\",\\\"message\\\":\\\"2025-12-04T09:39:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab\\\\n2025-12-04T09:39:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab to /host/opt/cni/bin/\\\\n2025-12-04T09:39:37Z [verbose] multus-daemon started\\\\n2025-12-04T09:39:37Z [verbose] Readiness Indicator file check\\\\n2025-12-04T09:40:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:24Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.156253 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.156308 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.156318 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 
09:40:24.156340 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.156354 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:24Z","lastTransitionTime":"2025-12-04T09:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.259793 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.259845 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.259858 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.259877 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.259893 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:24Z","lastTransitionTime":"2025-12-04T09:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.362762 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.362809 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.362821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.362839 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.362850 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:24Z","lastTransitionTime":"2025-12-04T09:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.452192 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.452279 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.452192 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:24 crc kubenswrapper[4776]: E1204 09:40:24.452348 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:24 crc kubenswrapper[4776]: E1204 09:40:24.452431 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:24 crc kubenswrapper[4776]: E1204 09:40:24.452527 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.465020 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.465063 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.465080 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.465099 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.465112 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:24Z","lastTransitionTime":"2025-12-04T09:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.568092 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.568143 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.568153 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.568172 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.568184 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:24Z","lastTransitionTime":"2025-12-04T09:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.671049 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.671111 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.671125 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.671149 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.671165 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:24Z","lastTransitionTime":"2025-12-04T09:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.773818 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.773867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.773876 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.773895 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.773906 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:24Z","lastTransitionTime":"2025-12-04T09:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.876389 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.876447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.876458 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.876477 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.876489 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:24Z","lastTransitionTime":"2025-12-04T09:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.979507 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.979569 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.979579 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.979598 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:24 crc kubenswrapper[4776]: I1204 09:40:24.979611 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:24Z","lastTransitionTime":"2025-12-04T09:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.082029 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.082087 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.082096 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.082123 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.082141 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:25Z","lastTransitionTime":"2025-12-04T09:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.185121 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.185185 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.185197 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.185221 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.185234 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:25Z","lastTransitionTime":"2025-12-04T09:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.287978 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.288048 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.288057 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.288073 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.288082 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:25Z","lastTransitionTime":"2025-12-04T09:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.390783 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.390845 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.390855 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.390877 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.390887 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:25Z","lastTransitionTime":"2025-12-04T09:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.452083 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:25 crc kubenswrapper[4776]: E1204 09:40:25.452231 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.465537 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.477119 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.486895 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.493990 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.494036 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.494046 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:25 crc 
kubenswrapper[4776]: I1204 09:40:25.494066 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.494079 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:25Z","lastTransitionTime":"2025-12-04T09:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.498247 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.512509 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0647b2bd40f4d09007d33837bceaec60188e87d77bae9c0f49a1d3c1a91d909c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:22Z\\\",\\\"message\\\":\\\"2025-12-04T09:39:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab\\\\n2025-12-04T09:39:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab to /host/opt/cni/bin/\\\\n2025-12-04T09:39:37Z [verbose] multus-daemon started\\\\n2025-12-04T09:39:37Z [verbose] 
Readiness Indicator file check\\\\n2025-12-04T09:40:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.533002 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:12Z\\\",\\\"message\\\":\\\":\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1204 09:40:12.227245 6451 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:40:12.227310 6451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:40:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590
acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.549457 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de
77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.564892 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.579585 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.594907 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.596836 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.596863 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.596872 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.596894 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.596904 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:25Z","lastTransitionTime":"2025-12-04T09:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.607979 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.622838 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.634497 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.647083 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.659633 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed0e05d-0d8d-4722-9f1c-d391bb980e4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124529cb7ef6f25
b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.673600 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.682652 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:25Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.699545 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.699582 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.699590 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.699606 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.699615 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:25Z","lastTransitionTime":"2025-12-04T09:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.802654 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.802699 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.802707 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.802723 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.802732 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:25Z","lastTransitionTime":"2025-12-04T09:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.905704 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.905739 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.905749 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.905766 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:25 crc kubenswrapper[4776]: I1204 09:40:25.905776 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:25Z","lastTransitionTime":"2025-12-04T09:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.009080 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.009153 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.009167 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.009190 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.009201 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:26Z","lastTransitionTime":"2025-12-04T09:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.111868 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.111945 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.111960 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.111979 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.111990 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:26Z","lastTransitionTime":"2025-12-04T09:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.214731 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.214790 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.214802 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.214821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.214837 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:26Z","lastTransitionTime":"2025-12-04T09:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.317669 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.317750 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.317776 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.317811 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.317840 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:26Z","lastTransitionTime":"2025-12-04T09:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.420881 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.420959 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.420973 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.420991 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.421002 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:26Z","lastTransitionTime":"2025-12-04T09:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.451664 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.451691 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:26 crc kubenswrapper[4776]: E1204 09:40:26.451804 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.451831 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:26 crc kubenswrapper[4776]: E1204 09:40:26.451970 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:26 crc kubenswrapper[4776]: E1204 09:40:26.452080 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.523578 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.523616 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.523625 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.523641 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.523651 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:26Z","lastTransitionTime":"2025-12-04T09:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.626872 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.626985 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.627004 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.627031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.627048 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:26Z","lastTransitionTime":"2025-12-04T09:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.729607 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.729653 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.729661 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.729683 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.729695 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:26Z","lastTransitionTime":"2025-12-04T09:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.833081 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.833159 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.833173 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.833197 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.833215 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:26Z","lastTransitionTime":"2025-12-04T09:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.935938 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.935989 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.936003 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.936024 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:26 crc kubenswrapper[4776]: I1204 09:40:26.936036 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:26Z","lastTransitionTime":"2025-12-04T09:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.038941 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.039005 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.039038 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.039058 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.039068 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:27Z","lastTransitionTime":"2025-12-04T09:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.141851 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.141905 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.141931 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.141949 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.141961 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:27Z","lastTransitionTime":"2025-12-04T09:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.245087 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.245144 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.245157 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.245181 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.245195 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:27Z","lastTransitionTime":"2025-12-04T09:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.347769 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.347853 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.347869 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.347887 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.347898 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:27Z","lastTransitionTime":"2025-12-04T09:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.451385 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.451427 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.451469 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.451479 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.451499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.451511 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:27Z","lastTransitionTime":"2025-12-04T09:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:27 crc kubenswrapper[4776]: E1204 09:40:27.451564 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.553469 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.553531 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.553554 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.553575 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.553590 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:27Z","lastTransitionTime":"2025-12-04T09:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.656386 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.656450 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.656463 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.656498 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.656513 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:27Z","lastTransitionTime":"2025-12-04T09:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.760389 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.760453 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.760465 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.760491 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.760509 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:27Z","lastTransitionTime":"2025-12-04T09:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.862627 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.862679 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.862693 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.862715 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.862729 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:27Z","lastTransitionTime":"2025-12-04T09:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.965868 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.965933 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.965947 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.965969 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:27 crc kubenswrapper[4776]: I1204 09:40:27.965984 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:27Z","lastTransitionTime":"2025-12-04T09:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.068799 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.068877 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.068897 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.068940 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.068960 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.171493 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.171550 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.171562 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.171605 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.171620 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.274136 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.274193 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.274202 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.274221 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.274233 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.379640 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.379703 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.379725 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.379750 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.379765 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.451378 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:28 crc kubenswrapper[4776]: E1204 09:40:28.451556 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.451404 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:28 crc kubenswrapper[4776]: E1204 09:40:28.451643 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.451382 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:28 crc kubenswrapper[4776]: E1204 09:40:28.451688 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.483401 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.483446 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.483458 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.483476 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.483488 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.586804 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.586870 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.586883 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.586906 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.586966 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.692018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.692081 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.692096 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.692115 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.692130 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.743246 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.743307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.743317 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.743336 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.743347 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: E1204 09:40:28.756142 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:28Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.760132 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.760179 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.760193 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.760212 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.760225 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: E1204 09:40:28.771621 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:28Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.775951 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.775999 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.776010 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.776029 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.776041 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: E1204 09:40:28.787508 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:28Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.790983 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.791029 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.791042 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.791053 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.791062 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: E1204 09:40:28.806317 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:28Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.810000 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.810039 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.810050 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.810070 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.810083 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: E1204 09:40:28.823134 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:28Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:28 crc kubenswrapper[4776]: E1204 09:40:28.823259 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.824905 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.824970 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.824981 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.824998 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.825011 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.928069 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.928163 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.928182 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.928243 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:28 crc kubenswrapper[4776]: I1204 09:40:28.928263 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:28Z","lastTransitionTime":"2025-12-04T09:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.031963 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.032035 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.032048 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.032072 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.032098 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:29Z","lastTransitionTime":"2025-12-04T09:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.134651 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.134713 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.134732 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.134753 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.134765 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:29Z","lastTransitionTime":"2025-12-04T09:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.237556 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.237607 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.237619 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.237638 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.237648 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:29Z","lastTransitionTime":"2025-12-04T09:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.341180 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.341236 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.341256 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.341278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.341292 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:29Z","lastTransitionTime":"2025-12-04T09:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.444075 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.444109 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.444122 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.444140 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.444150 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:29Z","lastTransitionTime":"2025-12-04T09:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.451417 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:29 crc kubenswrapper[4776]: E1204 09:40:29.451588 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.452331 4776 scope.go:117] "RemoveContainer" containerID="635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4" Dec 04 09:40:29 crc kubenswrapper[4776]: E1204 09:40:29.452496 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.547271 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.547727 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.547842 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.547875 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.547891 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:29Z","lastTransitionTime":"2025-12-04T09:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.651441 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.651495 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.651506 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.651526 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.651538 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:29Z","lastTransitionTime":"2025-12-04T09:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.754092 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.754132 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.754142 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.754158 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.754169 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:29Z","lastTransitionTime":"2025-12-04T09:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.856727 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.856813 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.856834 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.856874 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.856900 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:29Z","lastTransitionTime":"2025-12-04T09:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.959828 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.959889 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.959900 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.959932 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:29 crc kubenswrapper[4776]: I1204 09:40:29.959943 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:29Z","lastTransitionTime":"2025-12-04T09:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.063007 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.063157 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.063186 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.063208 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.063219 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:30Z","lastTransitionTime":"2025-12-04T09:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.165945 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.165993 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.166004 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.166022 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.166032 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:30Z","lastTransitionTime":"2025-12-04T09:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.269586 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.269651 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.269672 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.269703 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.269724 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:30Z","lastTransitionTime":"2025-12-04T09:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.374342 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.374386 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.374399 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.374418 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.374429 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:30Z","lastTransitionTime":"2025-12-04T09:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.452226 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:30 crc kubenswrapper[4776]: E1204 09:40:30.452414 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.452269 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:30 crc kubenswrapper[4776]: E1204 09:40:30.452518 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.452238 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:30 crc kubenswrapper[4776]: E1204 09:40:30.452590 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.476304 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.476362 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.476373 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.476391 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.476401 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:30Z","lastTransitionTime":"2025-12-04T09:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.578900 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.578966 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.578978 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.578998 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.579007 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:30Z","lastTransitionTime":"2025-12-04T09:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.681806 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.681848 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.681859 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.681874 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.681889 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:30Z","lastTransitionTime":"2025-12-04T09:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.784678 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.784758 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.784780 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.784805 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.784821 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:30Z","lastTransitionTime":"2025-12-04T09:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.887540 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.887589 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.887599 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.887619 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.887631 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:30Z","lastTransitionTime":"2025-12-04T09:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.990381 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.990437 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.990448 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.990464 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:30 crc kubenswrapper[4776]: I1204 09:40:30.990474 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:30Z","lastTransitionTime":"2025-12-04T09:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.093028 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.093136 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.093151 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.093168 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.093179 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:31Z","lastTransitionTime":"2025-12-04T09:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.196440 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.196516 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.196525 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.196546 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.196558 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:31Z","lastTransitionTime":"2025-12-04T09:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.299830 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.299867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.299878 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.299895 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.299906 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:31Z","lastTransitionTime":"2025-12-04T09:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.403529 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.403577 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.403590 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.403614 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.403632 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:31Z","lastTransitionTime":"2025-12-04T09:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.451547 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:31 crc kubenswrapper[4776]: E1204 09:40:31.451776 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.506528 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.506692 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.506714 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.506742 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.506764 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:31Z","lastTransitionTime":"2025-12-04T09:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.610337 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.610514 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.610540 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.610573 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.610597 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:31Z","lastTransitionTime":"2025-12-04T09:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.713692 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.713789 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.713802 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.713820 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.713833 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:31Z","lastTransitionTime":"2025-12-04T09:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.816874 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.816943 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.816958 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.816974 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.816988 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:31Z","lastTransitionTime":"2025-12-04T09:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.920705 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.920794 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.920808 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.920826 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:31 crc kubenswrapper[4776]: I1204 09:40:31.920837 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:31Z","lastTransitionTime":"2025-12-04T09:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.024026 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.024084 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.024095 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.024112 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.024128 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:32Z","lastTransitionTime":"2025-12-04T09:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.127163 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.127232 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.127242 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.127261 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.127274 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:32Z","lastTransitionTime":"2025-12-04T09:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.232115 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.232200 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.232214 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.232236 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.232251 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:32Z","lastTransitionTime":"2025-12-04T09:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.334974 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.335024 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.335035 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.335055 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.335069 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:32Z","lastTransitionTime":"2025-12-04T09:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.437625 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.438051 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.438066 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.438083 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.438094 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:32Z","lastTransitionTime":"2025-12-04T09:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.452130 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:32 crc kubenswrapper[4776]: E1204 09:40:32.452316 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.452154 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:32 crc kubenswrapper[4776]: E1204 09:40:32.452594 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.452751 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:32 crc kubenswrapper[4776]: E1204 09:40:32.452833 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.541639 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.541687 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.541698 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.541716 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.541728 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:32Z","lastTransitionTime":"2025-12-04T09:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.645339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.645392 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.645402 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.645428 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.645442 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:32Z","lastTransitionTime":"2025-12-04T09:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.748536 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.748566 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.748575 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.748590 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.748600 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:32Z","lastTransitionTime":"2025-12-04T09:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.851867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.852017 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.852043 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.852073 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.852151 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:32Z","lastTransitionTime":"2025-12-04T09:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.955109 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.955150 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.955159 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.955173 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:32 crc kubenswrapper[4776]: I1204 09:40:32.955182 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:32Z","lastTransitionTime":"2025-12-04T09:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.057371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.057450 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.057461 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.057480 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.057492 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:33Z","lastTransitionTime":"2025-12-04T09:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.160715 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.160780 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.160795 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.160817 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.160834 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:33Z","lastTransitionTime":"2025-12-04T09:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.263374 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.263446 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.263459 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.263485 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.263499 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:33Z","lastTransitionTime":"2025-12-04T09:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.366111 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.366166 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.366180 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.366199 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.366212 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:33Z","lastTransitionTime":"2025-12-04T09:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.451676 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:33 crc kubenswrapper[4776]: E1204 09:40:33.451836 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.468669 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.468738 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.468748 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.468764 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.468778 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:33Z","lastTransitionTime":"2025-12-04T09:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.571348 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.571414 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.571425 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.571446 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.571461 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:33Z","lastTransitionTime":"2025-12-04T09:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.674845 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.674908 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.674962 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.674997 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.675025 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:33Z","lastTransitionTime":"2025-12-04T09:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.777741 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.777794 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.777804 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.777823 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.777835 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:33Z","lastTransitionTime":"2025-12-04T09:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.881025 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.881071 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.881084 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.881104 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.881117 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:33Z","lastTransitionTime":"2025-12-04T09:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.983844 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.983893 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.983908 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.983947 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:33 crc kubenswrapper[4776]: I1204 09:40:33.983963 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:33Z","lastTransitionTime":"2025-12-04T09:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.087733 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.087807 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.087830 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.087859 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.087879 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:34Z","lastTransitionTime":"2025-12-04T09:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.194379 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.194423 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.194435 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.194453 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.194467 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:34Z","lastTransitionTime":"2025-12-04T09:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.298315 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.298368 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.298386 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.298412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.298431 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:34Z","lastTransitionTime":"2025-12-04T09:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.401621 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.401678 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.401694 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.401721 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.401742 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:34Z","lastTransitionTime":"2025-12-04T09:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.451274 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.451338 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:34 crc kubenswrapper[4776]: E1204 09:40:34.451450 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.451288 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:34 crc kubenswrapper[4776]: E1204 09:40:34.451588 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:34 crc kubenswrapper[4776]: E1204 09:40:34.451663 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.505736 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.505797 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.505810 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.505834 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.505854 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:34Z","lastTransitionTime":"2025-12-04T09:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.608837 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.609066 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.609086 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.609109 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.609121 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:34Z","lastTransitionTime":"2025-12-04T09:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.713336 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.713389 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.713398 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.713415 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.713426 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:34Z","lastTransitionTime":"2025-12-04T09:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.816567 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.816628 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.816643 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.816662 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.816675 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:34Z","lastTransitionTime":"2025-12-04T09:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.919131 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.919171 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.919181 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.919197 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:34 crc kubenswrapper[4776]: I1204 09:40:34.919206 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:34Z","lastTransitionTime":"2025-12-04T09:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.022303 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.022476 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.022500 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.022524 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.022539 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:35Z","lastTransitionTime":"2025-12-04T09:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.126353 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.126415 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.126427 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.126450 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.126462 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:35Z","lastTransitionTime":"2025-12-04T09:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.229465 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.229529 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.229543 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.229561 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.229574 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:35Z","lastTransitionTime":"2025-12-04T09:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.333811 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.333956 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.333994 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.334027 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.334051 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:35Z","lastTransitionTime":"2025-12-04T09:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.437338 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.437398 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.437410 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.437433 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.437447 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:35Z","lastTransitionTime":"2025-12-04T09:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.452056 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:35 crc kubenswrapper[4776]: E1204 09:40:35.452261 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.470054 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed0e05d-0d8d-4722-9f1c-d391bb980e4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://124529cb7ef6f25b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.483638 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.494669 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.507180 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0647b2bd40f4d09007d33837bceaec60188e87d77bae9c0f49a1d3c1a91d909c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:22Z\\\",\\\"message\\\":\\\"2025-12-04T09:39:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab\\\\n2025-12-04T09:39:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab to /host/opt/cni/bin/\\\\n2025-12-04T09:39:37Z [verbose] multus-daemon started\\\\n2025-12-04T09:39:37Z [verbose] Readiness Indicator file check\\\\n2025-12-04T09:40:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.529579 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:12Z\\\",\\\"message\\\":\\\":\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1204 09:40:12.227245 6451 
model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:40:12.227310 6451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:40:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590
acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.539380 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.539426 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.539439 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.539461 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.539475 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:35Z","lastTransitionTime":"2025-12-04T09:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.547382 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.565585 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.581038 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.594030 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.610826 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d141
53da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.626567 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.642372 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.642426 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.642439 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.642463 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.642475 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:35Z","lastTransitionTime":"2025-12-04T09:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.645239 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.661225 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.677640 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.694463 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.708590 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.723078 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:35Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.745466 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.745512 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.745522 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.745540 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.745554 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:35Z","lastTransitionTime":"2025-12-04T09:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.847356 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.847402 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.847411 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.847428 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.847438 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:35Z","lastTransitionTime":"2025-12-04T09:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.949615 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.949664 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.949677 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.949701 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:35 crc kubenswrapper[4776]: I1204 09:40:35.949715 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:35Z","lastTransitionTime":"2025-12-04T09:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.052394 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.052438 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.052450 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.052468 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.052481 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:36Z","lastTransitionTime":"2025-12-04T09:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.155054 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.155133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.155148 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.155171 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.155188 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:36Z","lastTransitionTime":"2025-12-04T09:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.257890 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.257954 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.257967 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.257987 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.257999 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:36Z","lastTransitionTime":"2025-12-04T09:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.362508 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.362568 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.362579 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.362601 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.362613 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:36Z","lastTransitionTime":"2025-12-04T09:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.451207 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.451264 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.451225 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:36 crc kubenswrapper[4776]: E1204 09:40:36.451392 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:36 crc kubenswrapper[4776]: E1204 09:40:36.451653 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:36 crc kubenswrapper[4776]: E1204 09:40:36.451782 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.465078 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.465133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.465144 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.465165 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.465200 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:36Z","lastTransitionTime":"2025-12-04T09:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.589798 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.589862 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.589882 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.589948 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.589968 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:36Z","lastTransitionTime":"2025-12-04T09:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.692646 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.692707 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.692720 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.692743 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.692755 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:36Z","lastTransitionTime":"2025-12-04T09:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.796134 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.796232 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.796259 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.796294 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.796318 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:36Z","lastTransitionTime":"2025-12-04T09:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.899327 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.899382 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.899397 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.899421 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:36 crc kubenswrapper[4776]: I1204 09:40:36.899439 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:36Z","lastTransitionTime":"2025-12-04T09:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.002750 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.002798 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.002809 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.002829 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.002840 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:37Z","lastTransitionTime":"2025-12-04T09:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.105814 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.105864 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.105876 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.105898 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.105934 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:37Z","lastTransitionTime":"2025-12-04T09:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.208480 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.208531 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.208541 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.208556 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.208566 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:37Z","lastTransitionTime":"2025-12-04T09:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.311318 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.311393 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.311405 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.311427 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.311446 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:37Z","lastTransitionTime":"2025-12-04T09:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.414177 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.414251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.414268 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.414300 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.414319 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:37Z","lastTransitionTime":"2025-12-04T09:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.452218 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:37 crc kubenswrapper[4776]: E1204 09:40:37.452443 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.517043 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.517102 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.517112 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.517133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.517145 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:37Z","lastTransitionTime":"2025-12-04T09:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.624405 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.625390 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.625467 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.625494 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.625507 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:37Z","lastTransitionTime":"2025-12-04T09:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.729391 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.729442 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.729523 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.729548 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.729559 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:37Z","lastTransitionTime":"2025-12-04T09:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.832274 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.832317 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.832325 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.832342 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.832353 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:37Z","lastTransitionTime":"2025-12-04T09:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.935416 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.935471 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.935481 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.935499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:37 crc kubenswrapper[4776]: I1204 09:40:37.935511 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:37Z","lastTransitionTime":"2025-12-04T09:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.038718 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.038803 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.038835 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.038873 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.038893 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:38Z","lastTransitionTime":"2025-12-04T09:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.141779 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.141821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.141830 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.141847 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.141858 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:38Z","lastTransitionTime":"2025-12-04T09:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.245335 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.245396 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.245414 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.245440 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.245461 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:38Z","lastTransitionTime":"2025-12-04T09:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.349634 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.349707 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.349724 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.349753 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.349772 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:38Z","lastTransitionTime":"2025-12-04T09:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.451740 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.451796 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.451740 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.451947 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.452127 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.452169 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.452181 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.452206 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.452213 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.452220 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:38Z","lastTransitionTime":"2025-12-04T09:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.452372 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.470777 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.471032 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:42.470957496 +0000 UTC m=+147.337437883 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.471072 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.471104 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.471136 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.471156 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.471237 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.471283 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.471310 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:41:42.471291166 +0000 UTC m=+147.337771553 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.471371 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:41:42.471351298 +0000 UTC m=+147.337831685 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.471399 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.471423 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.471396 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.471473 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.471488 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.471557 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-04 09:41:42.471537854 +0000 UTC m=+147.338018231 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.471439 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:40:38 crc kubenswrapper[4776]: E1204 09:40:38.471628 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:41:42.471614686 +0000 UTC m=+147.338095263 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.555124 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.555163 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.555172 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.555194 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.555212 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:38Z","lastTransitionTime":"2025-12-04T09:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.657545 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.657594 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.657604 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.657622 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.657634 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:38Z","lastTransitionTime":"2025-12-04T09:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.761447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.761511 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.761531 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.761556 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.761575 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:38Z","lastTransitionTime":"2025-12-04T09:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.864533 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.864580 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.864593 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.864610 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.864620 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:38Z","lastTransitionTime":"2025-12-04T09:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.973068 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.973108 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.973120 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.973137 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:38 crc kubenswrapper[4776]: I1204 09:40:38.973151 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:38Z","lastTransitionTime":"2025-12-04T09:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.076181 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.076227 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.076241 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.076260 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.076272 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:39Z","lastTransitionTime":"2025-12-04T09:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.179649 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.179700 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.179714 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.179735 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.179747 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:39Z","lastTransitionTime":"2025-12-04T09:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.185240 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.185294 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.185306 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.185326 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.185338 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:39Z","lastTransitionTime":"2025-12-04T09:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:39 crc kubenswrapper[4776]: E1204 09:40:39.199204 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.204341 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.204395 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.204412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.204437 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.204453 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:39Z","lastTransitionTime":"2025-12-04T09:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:39 crc kubenswrapper[4776]: E1204 09:40:39.217306 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.222139 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.222191 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.222203 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.222219 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.222230 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:39Z","lastTransitionTime":"2025-12-04T09:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:39 crc kubenswrapper[4776]: E1204 09:40:39.258372 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.262483 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.262520 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.262530 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.262551 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.262563 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:39Z","lastTransitionTime":"2025-12-04T09:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:39 crc kubenswrapper[4776]: E1204 09:40:39.302322 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:39Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:39 crc kubenswrapper[4776]: E1204 09:40:39.302568 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.305137 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.305175 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.305190 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.305214 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.305232 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:39Z","lastTransitionTime":"2025-12-04T09:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.417535 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.417600 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.417615 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.417637 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.417649 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:39Z","lastTransitionTime":"2025-12-04T09:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.451528 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:39 crc kubenswrapper[4776]: E1204 09:40:39.451740 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.520532 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.520592 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.520604 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.520624 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.520636 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:39Z","lastTransitionTime":"2025-12-04T09:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.623449 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.623514 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.623530 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.623554 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.623571 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:39Z","lastTransitionTime":"2025-12-04T09:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.727499 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.727865 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.727876 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.727894 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.727931 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:39Z","lastTransitionTime":"2025-12-04T09:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.831461 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.831513 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.831532 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.831556 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.831575 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:39Z","lastTransitionTime":"2025-12-04T09:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.934742 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.934791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.934804 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.934821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:39 crc kubenswrapper[4776]: I1204 09:40:39.934835 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:39Z","lastTransitionTime":"2025-12-04T09:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.037650 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.037711 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.037724 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.037745 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.037757 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:40Z","lastTransitionTime":"2025-12-04T09:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.140962 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.141018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.141030 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.141049 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.141070 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:40Z","lastTransitionTime":"2025-12-04T09:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.244029 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.244080 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.244094 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.244118 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.244131 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:40Z","lastTransitionTime":"2025-12-04T09:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.347447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.347507 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.347520 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.347544 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.347555 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:40Z","lastTransitionTime":"2025-12-04T09:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.451128 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.451171 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.451182 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.451200 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.451214 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:40Z","lastTransitionTime":"2025-12-04T09:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.451266 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.451267 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:40:40 crc kubenswrapper[4776]: E1204 09:40:40.451402 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.451373 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:40:40 crc kubenswrapper[4776]: E1204 09:40:40.451599 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:40:40 crc kubenswrapper[4776]: E1204 09:40:40.451706 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.554639 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.554690 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.554702 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.554723 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.554734 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:40Z","lastTransitionTime":"2025-12-04T09:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.657298 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.657355 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.657365 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.657379 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.657391 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:40Z","lastTransitionTime":"2025-12-04T09:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.760087 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.760136 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.760151 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.760174 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.760189 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:40Z","lastTransitionTime":"2025-12-04T09:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.863199 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.863251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.863262 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.863287 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.863299 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:40Z","lastTransitionTime":"2025-12-04T09:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.965296 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.965346 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.965357 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.965376 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:40 crc kubenswrapper[4776]: I1204 09:40:40.965385 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:40Z","lastTransitionTime":"2025-12-04T09:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.068092 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.068147 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.068159 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.068177 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.068192 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:41Z","lastTransitionTime":"2025-12-04T09:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.170866 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.170943 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.170954 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.170973 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.170984 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:41Z","lastTransitionTime":"2025-12-04T09:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.275485 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.275548 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.275562 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.275581 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.275596 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:41Z","lastTransitionTime":"2025-12-04T09:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.378908 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.379023 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.379036 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.379059 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.379070 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:41Z","lastTransitionTime":"2025-12-04T09:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.452477 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:40:41 crc kubenswrapper[4776]: E1204 09:40:41.452662 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.483216 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.483299 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.483314 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.483340 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.483361 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:41Z","lastTransitionTime":"2025-12-04T09:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.586540 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.586590 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.586601 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.586623 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.586636 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:41Z","lastTransitionTime":"2025-12-04T09:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.689536 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.689591 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.689606 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.689630 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.689645 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:41Z","lastTransitionTime":"2025-12-04T09:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.793114 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.793161 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.793174 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.793192 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.793206 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:41Z","lastTransitionTime":"2025-12-04T09:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.898161 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.898259 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.898273 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.898298 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:41 crc kubenswrapper[4776]: I1204 09:40:41.898311 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:41Z","lastTransitionTime":"2025-12-04T09:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.001562 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.001612 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.001627 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.001647 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.001663 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:42Z","lastTransitionTime":"2025-12-04T09:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.104848 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.104932 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.104946 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.104968 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.104983 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:42Z","lastTransitionTime":"2025-12-04T09:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.207290 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.207326 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.207334 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.207350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.207361 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:42Z","lastTransitionTime":"2025-12-04T09:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.309945 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.309994 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.310010 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.310031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.310045 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:42Z","lastTransitionTime":"2025-12-04T09:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.413498 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.413559 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.413573 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.413697 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.413721 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:42Z","lastTransitionTime":"2025-12-04T09:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.451599 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.451654 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.451655 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:40:42 crc kubenswrapper[4776]: E1204 09:40:42.451768 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:40:42 crc kubenswrapper[4776]: E1204 09:40:42.452010 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:40:42 crc kubenswrapper[4776]: E1204 09:40:42.452173 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.516318 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.516376 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.516390 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.516410 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.516425 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:42Z","lastTransitionTime":"2025-12-04T09:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.618580 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.618625 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.618636 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.618652 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.618669 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:42Z","lastTransitionTime":"2025-12-04T09:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.721648 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.721700 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.721709 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.721727 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.721738 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:42Z","lastTransitionTime":"2025-12-04T09:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.824765 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.824822 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.824834 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.824855 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.824867 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:42Z","lastTransitionTime":"2025-12-04T09:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.927180 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.927237 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.927251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.927269 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:42 crc kubenswrapper[4776]: I1204 09:40:42.927281 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:42Z","lastTransitionTime":"2025-12-04T09:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.030065 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.030120 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.030132 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.030152 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.030165 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:43Z","lastTransitionTime":"2025-12-04T09:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.133328 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.133361 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.133370 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.133386 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.133396 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:43Z","lastTransitionTime":"2025-12-04T09:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.236176 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.236219 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.236230 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.236248 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.236262 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:43Z","lastTransitionTime":"2025-12-04T09:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.338465 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.338516 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.338526 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.338547 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.338561 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:43Z","lastTransitionTime":"2025-12-04T09:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.442412 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.442472 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.442489 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.442510 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.442524 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:43Z","lastTransitionTime":"2025-12-04T09:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.452072 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:43 crc kubenswrapper[4776]: E1204 09:40:43.452593 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.453020 4776 scope.go:117] "RemoveContainer" containerID="635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.546544 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.546601 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.546617 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.546638 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.546651 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:43Z","lastTransitionTime":"2025-12-04T09:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.649572 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.649623 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.649635 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.649652 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.649665 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:43Z","lastTransitionTime":"2025-12-04T09:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.753118 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.753178 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.753191 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.753212 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.753224 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:43Z","lastTransitionTime":"2025-12-04T09:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.856397 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.856484 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.856503 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.856532 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.856554 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:43Z","lastTransitionTime":"2025-12-04T09:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.959461 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.959512 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.959524 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.959544 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.959556 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:43Z","lastTransitionTime":"2025-12-04T09:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.979363 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/2.log" Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.982934 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerStarted","Data":"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e"} Dec 04 09:40:43 crc kubenswrapper[4776]: I1204 09:40:43.983544 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.001562 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:43Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.017617 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a
43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.037472 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc 
kubenswrapper[4776]: I1204 09:40:44.054556 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40
a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 
09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.062144 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.062219 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.062228 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.062280 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.062293 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:44Z","lastTransitionTime":"2025-12-04T09:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.070262 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.084660 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.099593 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.115114 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.129727 4776 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed0e05d-0d8d-4722-9f1c-d391bb980e4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124529cb7ef6f25b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a938
0066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.145382 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.157887 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.166053 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.166098 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.166108 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.166128 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.166144 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:44Z","lastTransitionTime":"2025-12-04T09:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.171285 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.188296 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0647b2bd40f4d09007d33837bceaec60188e87d77bae9c0f49a1d3c1a91d909c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:22Z\\\",\\\"message\\\":\\\"2025-12-04T09:39:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab\\\\n2025-12-04T09:39:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab to /host/opt/cni/bin/\\\\n2025-12-04T09:39:37Z [verbose] multus-daemon started\\\\n2025-12-04T09:39:37Z [verbose] Readiness Indicator file check\\\\n2025-12-04T09:40:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.212360 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:12Z\\\",\\\"message\\\":\\\":\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1204 09:40:12.227245 6451 
model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:40:12.227310 6451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:40:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\
\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.227344 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.243995 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.260150 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.269359 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.269401 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.269409 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:44 crc 
kubenswrapper[4776]: I1204 09:40:44.269428 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.269439 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:44Z","lastTransitionTime":"2025-12-04T09:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.371853 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.371905 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.371935 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.371956 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.371966 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:44Z","lastTransitionTime":"2025-12-04T09:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.452134 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.452154 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.452186 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:44 crc kubenswrapper[4776]: E1204 09:40:44.452479 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:44 crc kubenswrapper[4776]: E1204 09:40:44.452636 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:44 crc kubenswrapper[4776]: E1204 09:40:44.452675 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.468609 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.473900 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.473957 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.473973 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.473987 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.473997 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:44Z","lastTransitionTime":"2025-12-04T09:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.576434 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.576478 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.576487 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.576506 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.576518 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:44Z","lastTransitionTime":"2025-12-04T09:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.679458 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.679513 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.679523 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.679545 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.679558 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:44Z","lastTransitionTime":"2025-12-04T09:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.782700 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.782770 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.782783 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.782803 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.782814 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:44Z","lastTransitionTime":"2025-12-04T09:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.884948 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.885004 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.885026 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.885044 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.885056 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:44Z","lastTransitionTime":"2025-12-04T09:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.989160 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.989220 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.989233 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.989253 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.989270 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:44Z","lastTransitionTime":"2025-12-04T09:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.989582 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/3.log" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.990581 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/2.log" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.994682 4776 generic.go:334] "Generic (PLEG): container finished" podID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerID="2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e" exitCode=1 Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.994741 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e"} Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.994811 4776 scope.go:117] "RemoveContainer" containerID="635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4" Dec 04 09:40:44 crc kubenswrapper[4776]: I1204 09:40:44.995476 4776 scope.go:117] "RemoveContainer" containerID="2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e" Dec 04 09:40:44 crc kubenswrapper[4776]: E1204 09:40:44.995650 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.016709 4776 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.035838 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.053507 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.067094 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.082764 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0647b2bd40f4d09007d33837bceaec60188e87d77bae9c0f49a1d3c1a91d909c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:22Z\\\",\\\"message\\\":\\\"2025-12-04T09:39:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab\\\\n2025-12-04T09:39:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab to /host/opt/cni/bin/\\\\n2025-12-04T09:39:37Z [verbose] multus-daemon started\\\\n2025-12-04T09:39:37Z [verbose] Readiness Indicator file check\\\\n2025-12-04T09:40:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.092070 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.092106 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.092115 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 
09:40:45.092132 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.092142 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:45Z","lastTransitionTime":"2025-12-04T09:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.103580 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:12Z\\\",\\\"message\\\":\\\":\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1204 09:40:12.227245 6451 
model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:40:12.227310 6451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:40:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:44Z\\\",\\\"message\\\":\\\" Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1204 09:40:44.432292 6880 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was 
not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z]\\\\nI1204 09:40:44.432072 6880 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1204 09:40:44.432306 6880 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1204 09:40:44.432051 6880 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",
\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 
09:40:45.120974 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a
45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"set
up\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.137329 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.151984 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.176223 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.192206 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.194396 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.194438 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.194447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.194462 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.194473 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:45Z","lastTransitionTime":"2025-12-04T09:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.211075 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.224453 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc 
kubenswrapper[4776]: I1204 09:40:45.245988 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"647946bb-6cd0-4b02-93e8-c80b56116c8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f62c5b9c990faa3f9b4ab4f9dccc2a662214b9470e862c5a2afefb71fa14c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://e9c469b2ce66f6216401f0b7c7580885e45f733c0c483a3a07055f049a0dce78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d941b5ac6cac8f936dda95900923d9d8db40ce85886b0078d4243b3f519f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df0701a260e80ea33d6ad4a787afb26bb9fe525c1073b187e759da8759431c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c703f5e0aa21310b7d07c40c18efc6d01e991a76334706230f44b6aacc399e64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://725f833b3201959184fa01b309746c6962b9a36f54c27a5df5723237dbd8e0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://725f833b3201959184fa01b309746c6962b9a36f54c27a5df5723237dbd8e0f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137726e0adaef3ccc2fe8f479e6348dc5074a83c83e925df0b228932c55a6b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137726e0adaef3ccc2fe8f479e6348dc5074a83c83e925df0b228932c55a6b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c45b51ee7e4278d0de6a1d8068b1db01bd5132455f33a7db698559c5ecb56ff3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45b51ee7e4278d0de6a1d8068b1db01bd5132455f33a7db698559c5ecb56ff3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.261396 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e
7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.278687 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed0e05d-0d8d-4722-9f1c-d391bb980e4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124529cb7ef6f25b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.295200 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.296957 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.297040 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.297054 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 
09:40:45.297093 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.297106 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:45Z","lastTransitionTime":"2025-12-04T09:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.308541 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.400559 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.400657 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.400709 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.400739 4776 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.400757 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:45Z","lastTransitionTime":"2025-12-04T09:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.451562 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:45 crc kubenswrapper[4776]: E1204 09:40:45.451797 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.471740 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.497721 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085
a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\
\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.507209 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.507348 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.507375 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.507400 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.507417 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:45Z","lastTransitionTime":"2025-12-04T09:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.517619 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc 
kubenswrapper[4776]: I1204 09:40:45.539244 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40
a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 
09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.557330 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.574307 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.592282 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.611037 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.611100 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.611115 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.611137 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.611534 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:45Z","lastTransitionTime":"2025-12-04T09:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.617320 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"647946bb-6cd0-4b02-93e8-c80b56116c8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f62c5b9c990faa3f9b4ab4f9dccc2a662214b9470e862c5a2afefb71fa14c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9c469b2ce66f6216401f0b7c7580885e45f733c0c483a3a07055f049a0dce78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d941b5ac6cac8f936dda95900923d9d8db40ce85886b0078d4243b3f519f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df0701a260e80ea33d6ad4a787afb26bb9fe525c1073b187e759da8759431c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c703f5e0aa21310b7d07c40c18efc6d01e991a76334706230f44b6aacc399e64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://725f833b3201959184fa01b309746c6962b9a36f54c27a5df5723237dbd8e0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://725f833b3201959184fa01b309746c6962b9a36f54c27a5df5723237dbd8e0f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137726e0adaef3ccc2fe8f479e6348dc5074a83c83e925df0b228932c55a6b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137726e0adaef3ccc2fe8f479e6348dc5074a83c83e925df0b228932c55a6b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c45b51ee7e4278d0de6a1d8068b1db01bd5132455f33a7db698559c5ecb56ff3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45b51ee7e4278d0de6a1d8068b1db01bd5132455f33a7db698559c5ecb56ff3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.631885 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e
7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.645252 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed0e05d-0d8d-4722-9f1c-d391bb980e4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124529cb7ef6f25b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.661565 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.674122 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.687042 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.701697 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0647b2bd40f4d09007d33837bceaec60188e87d77bae9c0f49a1d3c1a91d909c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:22Z\\\",\\\"message\\\":\\\"2025-12-04T09:39:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab\\\\n2025-12-04T09:39:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab to /host/opt/cni/bin/\\\\n2025-12-04T09:39:37Z [verbose] multus-daemon started\\\\n2025-12-04T09:39:37Z [verbose] 
Readiness Indicator file check\\\\n2025-12-04T09:40:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.714430 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.714495 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.714505 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.714524 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.714535 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:45Z","lastTransitionTime":"2025-12-04T09:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.721787 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://635ce5ae2eef3994165599cb2e112f698e10f588434c1978ae660488ede9c7c4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:12Z\\\",\\\"message\\\":\\\":\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.37\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1204 09:40:12.227245 6451 
model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1204 09:40:12.227310 6451 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:40:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:44Z\\\",\\\"message\\\":\\\" Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1204 09:40:44.432292 6880 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was 
not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z]\\\\nI1204 09:40:44.432072 6880 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1204 09:40:44.432306 6880 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1204 09:40:44.432051 6880 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",
\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 
09:40:45.735475 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.748470 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.759535 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:45Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.817717 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.817772 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.817783 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:45 crc 
kubenswrapper[4776]: I1204 09:40:45.817801 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.817815 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:45Z","lastTransitionTime":"2025-12-04T09:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.919956 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.919996 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.920006 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.920023 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:45 crc kubenswrapper[4776]: I1204 09:40:45.920037 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:45Z","lastTransitionTime":"2025-12-04T09:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.004905 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/3.log" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.008881 4776 scope.go:117] "RemoveContainer" containerID="2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e" Dec 04 09:40:46 crc kubenswrapper[4776]: E1204 09:40:46.009048 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.021161 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ed0e05d-0d8d-4722-9f1c-d391bb980e4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d82cb224ca7e94427e3d9ff0b578d2731e0b1e3fd57005d01cfa2f50e41218a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124529cb7ef6f25b4debea2fd3431e9231f1c29ce601a97f267bf2dad663e0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2701a6aa0400de81a6ac601704de6e90c74dc08272c7026686a7e0212c5a7f91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://22ca901f598e371b1a416b1983de82c019daa3c966f3d8684e9df77b993ee12c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.022018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.022046 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.022055 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.022071 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.022084 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:46Z","lastTransitionTime":"2025-12-04T09:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.035002 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.047774 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6hbgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7964ed5-5863-48f0-a329-1ff880943f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c202644f30d58aa287b40823cbca45f936f9e5ea2ffaeee9c0afa8128ef7230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jm5p9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6hbgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.069612 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc73cf8-973a-4254-9339-6c9f90c225bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:44Z\\\",\\\"message\\\":\\\" Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1204 09:40:44.432292 6880 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during 
admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:44Z is after 2025-08-24T17:21:41Z]\\\\nI1204 09:40:44.432072 6880 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI1204 09:40:44.432306 6880 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1204 09:40:44.432051 6880 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:40:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b82ce5b7125ee38590
acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q6zk4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.082289 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.095372 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.107113 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a57f7940-a976-4c85-bcb7-a1c24ba08266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed14630140f4a70eda6f433331b39248f417d7aa67dc19e3bb05d5cebbabccec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f
91afce409d99977948c4a500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26cd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-d6wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.119867 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-l99mn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c24f72b5-b018-4505-baa0-b5c4e6066364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f72090b4cdcc99eada05d08000e57f01b071f11936f0a8492c403494c28610ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vl5c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-l99mn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.124485 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.124525 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.124537 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.124554 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.124565 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:46Z","lastTransitionTime":"2025-12-04T09:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.133435 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7xv6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423f8d5c-40c6-4efe-935f-7a9373d6becd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0647b2bd40f4d09007d33837bceaec60188e87d77bae9c0f49a1d3c1a91d909c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T09:40:22Z\\\",\\\"message\\\":\\\"2025-12-04T09:39:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab\\\\n2025-12-04T09:39:37+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_05b6a056-769b-42c9-a939-9a635e987bab to /host/opt/cni/bin/\\\\n2025-12-04T09:39:37Z [verbose] multus-daemon started\\\\n2025-12-04T09:39:37Z [verbose] Readiness Indicator file check\\\\n2025-12-04T09:40:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:40:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95p9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7xv6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.143714 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.158844 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.172106 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.186452 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.204491 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.222306 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.227071 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.227155 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.227198 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.227226 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.227242 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:46Z","lastTransitionTime":"2025-12-04T09:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.239376 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.259241 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"647946bb-6cd0-4b02-93e8-c80b56116c8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18f62c5b9c990faa3f9b4ab4f9dccc2a662214b9470e862c5a2afefb71fa14c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9c469b2ce66f6216401f0b7c7580885e45f733c0c483a3a07055f049a0dce78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d941b5ac6cac8f936dda95900923d9d8db40ce85886b0078d4243b3f519f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:19Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df0701a260e80ea33d6ad4a787afb26bb9fe525c1073b187e759da8759431c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c703f5e0aa21310b7d07c40c18efc6d01e991a76334706230f44b6aacc399e64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://725f833b3201959184fa01b309746c6962b9a36f54c27a5df5723237dbd8e0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://725f833b3201959184fa01b309746c6962b9a36f54c27a5df5723237dbd8e0f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137726e0adaef3ccc2fe8f479e6348dc5074a83c83e925df0b228932c55a6b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137726e0adaef3ccc2fe8f479e6348dc5074a83c83e925df0b228932c55a6b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c45b51ee7e4278d0de6a1d8068b1db01bd5132455f33a7db698559c5ecb56ff3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c45b51ee7e4278d0de6a1d8068b1db01bd5132455f33a7db698559c5ecb56ff3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.270439 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8a5a83-82d6-4af1-9afa-816275ced3a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80c337edca39009a02069b3fcdd608c6a3683ef2d5a3272de6663c8c803db0b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b119452dfa29d27aba29850ed4a7fa19d507e
7c5b4e038814a32a70511de1346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wp6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vk56j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:46Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.330192 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.330250 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.330266 4776 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.330290 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.330308 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:46Z","lastTransitionTime":"2025-12-04T09:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.433111 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.433181 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.433204 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.433239 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.433263 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:46Z","lastTransitionTime":"2025-12-04T09:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.451630 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.451664 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.451641 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:46 crc kubenswrapper[4776]: E1204 09:40:46.451811 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:46 crc kubenswrapper[4776]: E1204 09:40:46.452035 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:46 crc kubenswrapper[4776]: E1204 09:40:46.452075 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.536532 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.536802 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.536833 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.536863 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.536885 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:46Z","lastTransitionTime":"2025-12-04T09:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.639595 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.639666 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.639684 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.639706 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.639725 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:46Z","lastTransitionTime":"2025-12-04T09:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.742935 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.742985 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.742996 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.743018 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.743030 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:46Z","lastTransitionTime":"2025-12-04T09:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.846886 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.846947 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.846959 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.846977 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.846989 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:46Z","lastTransitionTime":"2025-12-04T09:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.950535 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.950590 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.950599 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.950620 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:46 crc kubenswrapper[4776]: I1204 09:40:46.950631 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:46Z","lastTransitionTime":"2025-12-04T09:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.053799 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.053862 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.053876 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.053901 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.053937 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:47Z","lastTransitionTime":"2025-12-04T09:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.157174 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.157241 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.157253 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.157278 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.157290 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:47Z","lastTransitionTime":"2025-12-04T09:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.259832 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.259882 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.259891 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.259910 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.259941 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:47Z","lastTransitionTime":"2025-12-04T09:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.362467 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.362544 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.362562 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.362586 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.362601 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:47Z","lastTransitionTime":"2025-12-04T09:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.451873 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:47 crc kubenswrapper[4776]: E1204 09:40:47.452457 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.466797 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.466842 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.466870 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.466888 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.466938 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.466957 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:47Z","lastTransitionTime":"2025-12-04T09:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.570363 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.570404 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.570416 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.570440 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.570456 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:47Z","lastTransitionTime":"2025-12-04T09:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.673072 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.673125 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.673141 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.673162 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.673174 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:47Z","lastTransitionTime":"2025-12-04T09:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.776776 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.776815 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.776823 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.776838 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.776847 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:47Z","lastTransitionTime":"2025-12-04T09:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.880492 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.880562 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.880581 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.880606 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.880623 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:47Z","lastTransitionTime":"2025-12-04T09:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.983183 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.983259 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.983277 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.983302 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:47 crc kubenswrapper[4776]: I1204 09:40:47.983320 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:47Z","lastTransitionTime":"2025-12-04T09:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.086312 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.086360 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.086371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.086389 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.086403 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:48Z","lastTransitionTime":"2025-12-04T09:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.189097 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.189184 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.189202 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.189227 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.189245 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:48Z","lastTransitionTime":"2025-12-04T09:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.291622 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.291707 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.291730 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.291760 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.291778 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:48Z","lastTransitionTime":"2025-12-04T09:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.394757 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.394813 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.394826 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.394848 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.394865 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:48Z","lastTransitionTime":"2025-12-04T09:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.451535 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.451652 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.451690 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:48 crc kubenswrapper[4776]: E1204 09:40:48.452298 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:48 crc kubenswrapper[4776]: E1204 09:40:48.452374 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:48 crc kubenswrapper[4776]: E1204 09:40:48.452690 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.498898 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.499016 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.499034 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.499060 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.499079 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:48Z","lastTransitionTime":"2025-12-04T09:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.601792 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.601869 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.601884 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.601909 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.601966 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:48Z","lastTransitionTime":"2025-12-04T09:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.707200 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.707285 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.707298 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.707339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.707357 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:48Z","lastTransitionTime":"2025-12-04T09:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.811008 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.811065 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.811078 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.811102 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.811117 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:48Z","lastTransitionTime":"2025-12-04T09:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.914269 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.914361 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.914373 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.914391 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:48 crc kubenswrapper[4776]: I1204 09:40:48.914404 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:48Z","lastTransitionTime":"2025-12-04T09:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.023218 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.023271 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.023285 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.023307 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.023321 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.125975 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.126025 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.126035 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.126054 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.126065 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.228602 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.228666 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.228682 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.228701 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.228713 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.331586 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.331628 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.331639 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.331656 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.331673 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.435065 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.435134 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.435146 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.435168 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.435200 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.451737 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:49 crc kubenswrapper[4776]: E1204 09:40:49.452274 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.537931 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.538628 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.538724 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.538809 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.538883 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.612314 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.612375 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.612385 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.612406 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.612422 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: E1204 09:40:49.627942 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.632389 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.632453 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.632466 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.632483 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.632495 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: E1204 09:40:49.647131 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.651668 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.651833 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.652068 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.652165 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.652238 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: E1204 09:40:49.666022 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.670503 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.670570 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.670581 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.670602 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.670619 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: E1204 09:40:49.686536 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.693287 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.693350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.693366 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.693390 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.693403 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: E1204 09:40:49.708666 4776 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T09:40:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19685a11-7601-4b6d-a386-4bf61b88c87c\\\",\\\"systemUUID\\\":\\\"ae4f41f6-942c-4e00-b556-5f0151068ad6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:49Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:49 crc kubenswrapper[4776]: E1204 09:40:49.708859 4776 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.716839 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.716902 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.716941 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.716961 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.716972 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.819547 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.820012 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.820115 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.820217 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.820297 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.923518 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.923610 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.923624 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.923646 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:49 crc kubenswrapper[4776]: I1204 09:40:49.923663 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:49Z","lastTransitionTime":"2025-12-04T09:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.026939 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.027201 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.027319 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.027407 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.027472 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:50Z","lastTransitionTime":"2025-12-04T09:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.130880 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.131242 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.131306 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.131383 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.131468 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:50Z","lastTransitionTime":"2025-12-04T09:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.234557 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.234612 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.234623 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.234640 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.234655 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:50Z","lastTransitionTime":"2025-12-04T09:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.337592 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.337654 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.337668 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.337688 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.337704 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:50Z","lastTransitionTime":"2025-12-04T09:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.440339 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.440381 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.440390 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.440406 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.440417 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:50Z","lastTransitionTime":"2025-12-04T09:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.452212 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:50 crc kubenswrapper[4776]: E1204 09:40:50.452357 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.452400 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.452436 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:50 crc kubenswrapper[4776]: E1204 09:40:50.452507 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:50 crc kubenswrapper[4776]: E1204 09:40:50.452564 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.543829 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.543890 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.543903 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.543940 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.543953 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:50Z","lastTransitionTime":"2025-12-04T09:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.647351 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.647403 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.647419 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.647440 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.647452 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:50Z","lastTransitionTime":"2025-12-04T09:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.750833 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.750885 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.750897 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.750948 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.750964 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:50Z","lastTransitionTime":"2025-12-04T09:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.855101 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.855202 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.855215 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.855240 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.855253 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:50Z","lastTransitionTime":"2025-12-04T09:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.958847 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.958897 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.958909 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.958948 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:50 crc kubenswrapper[4776]: I1204 09:40:50.958963 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:50Z","lastTransitionTime":"2025-12-04T09:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.061579 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.061629 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.061643 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.061663 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.061675 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:51Z","lastTransitionTime":"2025-12-04T09:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.164447 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.164495 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.164507 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.164524 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.164535 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:51Z","lastTransitionTime":"2025-12-04T09:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.268175 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.268246 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.268271 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.268304 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.268317 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:51Z","lastTransitionTime":"2025-12-04T09:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.371599 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.371666 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.371684 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.371710 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.371724 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:51Z","lastTransitionTime":"2025-12-04T09:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.451510 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:51 crc kubenswrapper[4776]: E1204 09:40:51.451703 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.474246 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.474305 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.474317 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.474335 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.474359 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:51Z","lastTransitionTime":"2025-12-04T09:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.577201 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.577261 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.577272 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.577328 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.577343 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:51Z","lastTransitionTime":"2025-12-04T09:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.679907 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.680004 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.680021 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.680044 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.680056 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:51Z","lastTransitionTime":"2025-12-04T09:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.783168 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.783217 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.783230 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.783251 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.783268 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:51Z","lastTransitionTime":"2025-12-04T09:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.886153 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.886204 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.886213 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.886233 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.886246 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:51Z","lastTransitionTime":"2025-12-04T09:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.989417 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.989465 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.989476 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.989496 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:51 crc kubenswrapper[4776]: I1204 09:40:51.989507 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:51Z","lastTransitionTime":"2025-12-04T09:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.092440 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.092490 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.092504 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.092521 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.092532 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:52Z","lastTransitionTime":"2025-12-04T09:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.195726 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.195781 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.195791 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.195813 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.195832 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:52Z","lastTransitionTime":"2025-12-04T09:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.298563 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.298616 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.298625 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.298644 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.298655 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:52Z","lastTransitionTime":"2025-12-04T09:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.401409 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.401476 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.401487 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.401511 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.401522 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:52Z","lastTransitionTime":"2025-12-04T09:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.452197 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.452255 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.452275 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:52 crc kubenswrapper[4776]: E1204 09:40:52.452354 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:52 crc kubenswrapper[4776]: E1204 09:40:52.452449 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:52 crc kubenswrapper[4776]: E1204 09:40:52.452564 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.504409 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.504471 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.504483 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.504504 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.504518 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:52Z","lastTransitionTime":"2025-12-04T09:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.607651 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.607732 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.607755 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.607789 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.607813 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:52Z","lastTransitionTime":"2025-12-04T09:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.709900 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.709974 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.709991 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.710011 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.710019 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:52Z","lastTransitionTime":"2025-12-04T09:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.812448 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.812504 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.812517 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.812537 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.812551 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:52Z","lastTransitionTime":"2025-12-04T09:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.915946 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.916019 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.916031 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.916053 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:52 crc kubenswrapper[4776]: I1204 09:40:52.916067 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:52Z","lastTransitionTime":"2025-12-04T09:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.018875 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.018944 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.018953 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.018975 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.018989 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:53Z","lastTransitionTime":"2025-12-04T09:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.121971 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.122024 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.122036 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.122051 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.122061 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:53Z","lastTransitionTime":"2025-12-04T09:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.224645 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.224710 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.224730 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.224753 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.224767 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:53Z","lastTransitionTime":"2025-12-04T09:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.327609 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.327658 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.327672 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.327694 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.327707 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:53Z","lastTransitionTime":"2025-12-04T09:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.430229 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.430288 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.430306 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.430333 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.430352 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:53Z","lastTransitionTime":"2025-12-04T09:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.451827 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:53 crc kubenswrapper[4776]: E1204 09:40:53.452085 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.532704 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.532772 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.532811 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.532844 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.532869 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:53Z","lastTransitionTime":"2025-12-04T09:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.635754 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.635792 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.635822 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.635838 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.635849 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:53Z","lastTransitionTime":"2025-12-04T09:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.739016 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.739090 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.739101 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.739117 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.739125 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:53Z","lastTransitionTime":"2025-12-04T09:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.750532 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs\") pod \"network-metrics-daemon-g5jzd\" (UID: \"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:53 crc kubenswrapper[4776]: E1204 09:40:53.750709 4776 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:40:53 crc kubenswrapper[4776]: E1204 09:40:53.750767 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs podName:5cca4979-0471-4a2c-97ca-b6ec6fdd935d nodeName:}" failed. No retries permitted until 2025-12-04 09:41:57.75074697 +0000 UTC m=+162.617227347 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs") pod "network-metrics-daemon-g5jzd" (UID: "5cca4979-0471-4a2c-97ca-b6ec6fdd935d") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.842492 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.842537 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.842551 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.842570 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.842585 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:53Z","lastTransitionTime":"2025-12-04T09:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.945806 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.945849 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.945865 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.945883 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:53 crc kubenswrapper[4776]: I1204 09:40:53.945896 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:53Z","lastTransitionTime":"2025-12-04T09:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.048526 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.048575 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.048585 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.048607 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.048619 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:54Z","lastTransitionTime":"2025-12-04T09:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.151978 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.152073 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.152100 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.152133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.152159 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:54Z","lastTransitionTime":"2025-12-04T09:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.257757 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.257812 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.257824 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.257844 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.257856 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:54Z","lastTransitionTime":"2025-12-04T09:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.360236 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.360309 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.360322 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.360343 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.360358 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:54Z","lastTransitionTime":"2025-12-04T09:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.451463 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.451514 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:54 crc kubenswrapper[4776]: E1204 09:40:54.451660 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.451756 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:54 crc kubenswrapper[4776]: E1204 09:40:54.452013 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:54 crc kubenswrapper[4776]: E1204 09:40:54.452000 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.463867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.463964 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.463978 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.464001 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.464018 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:54Z","lastTransitionTime":"2025-12-04T09:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.566672 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.566716 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.566730 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.566748 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.566759 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:54Z","lastTransitionTime":"2025-12-04T09:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.669606 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.669686 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.669703 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.669724 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.669736 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:54Z","lastTransitionTime":"2025-12-04T09:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.773131 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.773212 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.773229 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.773253 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.773269 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:54Z","lastTransitionTime":"2025-12-04T09:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.876767 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.876832 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.876858 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.876886 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.876903 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:54Z","lastTransitionTime":"2025-12-04T09:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.980360 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.980416 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.980427 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.980446 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:54 crc kubenswrapper[4776]: I1204 09:40:54.980458 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:54Z","lastTransitionTime":"2025-12-04T09:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.082371 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.082427 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.082438 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.082458 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.082470 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:55Z","lastTransitionTime":"2025-12-04T09:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.185389 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.185444 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.185453 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.185476 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.185489 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:55Z","lastTransitionTime":"2025-12-04T09:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.288869 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.288955 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.288971 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.288991 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.289003 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:55Z","lastTransitionTime":"2025-12-04T09:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.391120 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.391173 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.391185 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.391206 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.391218 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:55Z","lastTransitionTime":"2025-12-04T09:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.452180 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:55 crc kubenswrapper[4776]: E1204 09:40:55.452506 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.467501 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmz9v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g5jzd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:55 crc 
kubenswrapper[4776]: I1204 09:40:55.485951 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e59a3c6-f022-4e05-a66d-a763ec43e08c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ec06f72e9a40
a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T09:39:33Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1204 09:39:27.748837 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 09:39:27.750220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-378180739/tls.crt::/tmp/serving-cert-378180739/tls.key\\\\\\\"\\\\nI1204 09:39:33.636874 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 09:39:33.641086 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 09:39:33.641123 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 09:39:33.641177 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 09:39:33.641196 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 09:39:33.648520 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 09:39:33.648542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648548 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 09:39:33.648554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 09:39:33.648558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 09:39:33.648562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 
09:39:33.648565 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 09:39:33.648795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 09:39:33.652963 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.494778 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.494823 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.494837 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.494855 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.494869 4776 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:55Z","lastTransitionTime":"2025-12-04T09:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.504846 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98aee8ac-1349-4844-8ef5-642da6177d6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019249ad49101c1f465e34223bbeaa01102dfe3fb7b854936f6ad50962157aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://009b85ca20e9e3c9e9815796b0da1cdd7c2ada9335f503e482fe85d384f172cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://51579416beef8110cfe2f8e84cf63d16799b65087472f40326e590d4b7cf0c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.518483 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a77e06a9f568ae952ad61aaa2c9a338f0fad6e14d43d5c5047dad4f5c9c1435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bb2d81e4363a4c046ddf7441a56d317b39b1d655e9e31eb34eb7ac8cd9202e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.532829 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0646e1b45cf79d71e8e3080c0f61d656d45ee9bf121b81c56e05f42bec1a5e29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.545054 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://660357dddfb09858bdfd9f6d5bf7cedd9f7b9dcdb029f3677cb6c787da4fb773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.560162 4776 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"253a8526-0cc1-4441-a032-6f8f96b66f40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T09:39:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e326da76d246c6285317745b9c73fc971219ac1948343242c369759ddf4774a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T09:39:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://805149ce608826afcfa8459f1d503ac0ae4bec88e31d4658cb7675ab3c38cc37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498c16c3d095acd430982ce8ac910788fd27dec3bdf0d092f406d162a3767c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a490bfadda8fd02b63b20cd5286ee1af2c00721fbd4b69541a8abd560066b94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:39Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4018a43d1b1a7727a9f01a376a539c5b92a963141d6826b39609927b37eff28b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f278fe140d682c77a952771cc39d14153da33837322b889fd83f043249eae4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48c4a13ad1e6e3330d918c762dd0546fb4af1f2112a3e1c99df43aeda1d3000e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T09:39:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T09:39:45Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99mbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T09:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vzlvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T09:40:55Z is after 2025-08-24T17:21:41Z" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.594953 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=11.5949071 podStartE2EDuration="11.5949071s" podCreationTimestamp="2025-12-04 09:40:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:40:55.594154978 +0000 UTC m=+100.460635365" watchObservedRunningTime="2025-12-04 09:40:55.5949071 +0000 UTC m=+100.461387487" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.605045 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.605133 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.605147 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:55 crc 
kubenswrapper[4776]: I1204 09:40:55.605194 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.605208 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:55Z","lastTransitionTime":"2025-12-04T09:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.613708 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vk56j" podStartSLOduration=80.613671607 podStartE2EDuration="1m20.613671607s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:40:55.611888653 +0000 UTC m=+100.478369040" watchObservedRunningTime="2025-12-04 09:40:55.613671607 +0000 UTC m=+100.480151984" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.628897 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=53.628852365 podStartE2EDuration="53.628852365s" podCreationTimestamp="2025-12-04 09:40:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:40:55.627909447 +0000 UTC m=+100.494389824" watchObservedRunningTime="2025-12-04 09:40:55.628852365 +0000 UTC m=+100.495332742" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.685952 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-6hbgv" podStartSLOduration=80.685905478 podStartE2EDuration="1m20.685905478s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:40:55.655401517 +0000 UTC m=+100.521881914" watchObservedRunningTime="2025-12-04 09:40:55.685905478 +0000 UTC m=+100.552385856" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.708778 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.708833 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.708845 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.708867 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.708890 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:55Z","lastTransitionTime":"2025-12-04T09:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.719627 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=8.719598786 podStartE2EDuration="8.719598786s" podCreationTimestamp="2025-12-04 09:40:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:40:55.704728557 +0000 UTC m=+100.571208944" watchObservedRunningTime="2025-12-04 09:40:55.719598786 +0000 UTC m=+100.586079163" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.743182 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podStartSLOduration=80.743158448 podStartE2EDuration="1m20.743158448s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:40:55.743116927 +0000 UTC m=+100.609597314" watchObservedRunningTime="2025-12-04 09:40:55.743158448 +0000 UTC m=+100.609638825" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.754771 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-l99mn" podStartSLOduration=80.754745437 podStartE2EDuration="1m20.754745437s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:40:55.754324245 +0000 UTC m=+100.620804632" watchObservedRunningTime="2025-12-04 09:40:55.754745437 +0000 UTC m=+100.621225814" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.811114 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:55 crc 
kubenswrapper[4776]: I1204 09:40:55.811166 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.811179 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.811198 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.811210 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:55Z","lastTransitionTime":"2025-12-04T09:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.913770 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.913821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.913855 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.913874 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:55 crc kubenswrapper[4776]: I1204 09:40:55.913886 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:55Z","lastTransitionTime":"2025-12-04T09:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.016907 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.016996 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.017009 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.017029 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.017042 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:56Z","lastTransitionTime":"2025-12-04T09:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.120313 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.120415 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.120427 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.120445 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.120458 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:56Z","lastTransitionTime":"2025-12-04T09:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.222360 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.222420 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.222433 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.222456 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.222468 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:56Z","lastTransitionTime":"2025-12-04T09:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.325280 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.325341 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.325352 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.325367 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.325377 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:56Z","lastTransitionTime":"2025-12-04T09:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.428823 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.428875 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.428890 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.428910 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.428943 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:56Z","lastTransitionTime":"2025-12-04T09:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.451526 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.451534 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.451554 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:56 crc kubenswrapper[4776]: E1204 09:40:56.451810 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:56 crc kubenswrapper[4776]: E1204 09:40:56.452324 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:56 crc kubenswrapper[4776]: E1204 09:40:56.452566 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.452763 4776 scope.go:117] "RemoveContainer" containerID="2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e" Dec 04 09:40:56 crc kubenswrapper[4776]: E1204 09:40:56.453002 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.532292 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.532340 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.532350 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.532369 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.532382 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:56Z","lastTransitionTime":"2025-12-04T09:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.635705 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.635771 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.635783 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.635800 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.635810 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:56Z","lastTransitionTime":"2025-12-04T09:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.738847 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.738893 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.738902 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.738932 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.738944 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:56Z","lastTransitionTime":"2025-12-04T09:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.841312 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.841352 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.841364 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.841420 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.841434 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:56Z","lastTransitionTime":"2025-12-04T09:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.945807 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.945876 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.945894 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.945961 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:56 crc kubenswrapper[4776]: I1204 09:40:56.946002 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:56Z","lastTransitionTime":"2025-12-04T09:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.048883 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.048956 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.048969 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.048997 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.049008 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:57Z","lastTransitionTime":"2025-12-04T09:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.151861 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.151965 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.152006 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.152034 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.152051 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:57Z","lastTransitionTime":"2025-12-04T09:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.256035 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.256141 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.256173 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.256212 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.256250 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:57Z","lastTransitionTime":"2025-12-04T09:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.360197 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.360257 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.360276 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.360301 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.360319 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:57Z","lastTransitionTime":"2025-12-04T09:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.451519 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:57 crc kubenswrapper[4776]: E1204 09:40:57.451750 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.462157 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.462465 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.462577 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.462667 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.462746 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:57Z","lastTransitionTime":"2025-12-04T09:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.565741 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.565804 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.565821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.565844 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.565886 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:57Z","lastTransitionTime":"2025-12-04T09:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.667906 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.668440 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.668640 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.668990 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.669196 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:57Z","lastTransitionTime":"2025-12-04T09:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.772405 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.772460 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.772476 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.772503 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.772517 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:57Z","lastTransitionTime":"2025-12-04T09:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.875042 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.875455 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.875591 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.875756 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.875906 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:57Z","lastTransitionTime":"2025-12-04T09:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.979083 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.979136 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.979146 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.979167 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:57 crc kubenswrapper[4776]: I1204 09:40:57.979179 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:57Z","lastTransitionTime":"2025-12-04T09:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.081741 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.081789 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.081801 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.081821 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.081832 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:58Z","lastTransitionTime":"2025-12-04T09:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.184231 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.184282 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.184293 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.184309 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.184320 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:58Z","lastTransitionTime":"2025-12-04T09:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.287742 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.287800 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.287810 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.287829 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.287841 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:58Z","lastTransitionTime":"2025-12-04T09:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.390700 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.390741 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.390753 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.390769 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.390779 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:58Z","lastTransitionTime":"2025-12-04T09:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.452064 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.452129 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.452072 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:40:58 crc kubenswrapper[4776]: E1204 09:40:58.452426 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:40:58 crc kubenswrapper[4776]: E1204 09:40:58.452540 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:40:58 crc kubenswrapper[4776]: E1204 09:40:58.452575 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.493374 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.494147 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.494196 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.494224 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.494250 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:58Z","lastTransitionTime":"2025-12-04T09:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.597098 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.597151 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.597168 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.597189 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.597210 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:58Z","lastTransitionTime":"2025-12-04T09:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.700738 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.700781 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.700795 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.700815 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.700831 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:58Z","lastTransitionTime":"2025-12-04T09:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.803316 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.803394 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.803413 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.803441 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.803458 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:58Z","lastTransitionTime":"2025-12-04T09:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.906205 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.906256 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.906267 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.906284 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:58 crc kubenswrapper[4776]: I1204 09:40:58.906294 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:58Z","lastTransitionTime":"2025-12-04T09:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.008475 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.008555 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.008595 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.008614 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.008625 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:59Z","lastTransitionTime":"2025-12-04T09:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.111055 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.111123 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.111136 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.111158 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.111170 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:59Z","lastTransitionTime":"2025-12-04T09:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.214181 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.214257 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.214277 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.214304 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.214321 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:59Z","lastTransitionTime":"2025-12-04T09:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.317413 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.317473 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.317489 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.317510 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.317526 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:59Z","lastTransitionTime":"2025-12-04T09:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.420080 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.420137 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.420147 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.420169 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.420184 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:59Z","lastTransitionTime":"2025-12-04T09:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.452161 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:40:59 crc kubenswrapper[4776]: E1204 09:40:59.452360 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.522587 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.522671 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.522681 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.522702 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.522714 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:59Z","lastTransitionTime":"2025-12-04T09:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.626264 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.626408 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.626422 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.626445 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.626466 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:59Z","lastTransitionTime":"2025-12-04T09:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.729769 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.729833 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.729843 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.729864 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.729875 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:59Z","lastTransitionTime":"2025-12-04T09:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.833455 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.833530 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.833541 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.833564 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.833578 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:59Z","lastTransitionTime":"2025-12-04T09:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.938156 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.938256 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.938285 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.938329 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.938359 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:59Z","lastTransitionTime":"2025-12-04T09:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.945156 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.945215 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.945226 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.945243 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 09:40:59 crc kubenswrapper[4776]: I1204 09:40:59.945255 4776 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T09:40:59Z","lastTransitionTime":"2025-12-04T09:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.005173 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7xv6z" podStartSLOduration=85.005117501 podStartE2EDuration="1m25.005117501s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:40:55.771807862 +0000 UTC m=+100.638288259" watchObservedRunningTime="2025-12-04 09:41:00.005117501 +0000 UTC m=+104.871597878" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.006233 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24"] Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.006694 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.009515 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.009536 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.010660 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.010802 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.043707 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=86.043678616 
podStartE2EDuration="1m26.043678616s" podCreationTimestamp="2025-12-04 09:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:00.042064277 +0000 UTC m=+104.908544654" watchObservedRunningTime="2025-12-04 09:41:00.043678616 +0000 UTC m=+104.910158983" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.056325 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=82.056299537 podStartE2EDuration="1m22.056299537s" podCreationTimestamp="2025-12-04 09:39:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:00.055714249 +0000 UTC m=+104.922194626" watchObservedRunningTime="2025-12-04 09:41:00.056299537 +0000 UTC m=+104.922779914" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.121962 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.122010 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.122078 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.122155 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.122208 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.164750 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vzlvd" podStartSLOduration=85.164728641 podStartE2EDuration="1m25.164728641s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:00.149935134 +0000 UTC m=+105.016415511" watchObservedRunningTime="2025-12-04 09:41:00.164728641 +0000 UTC m=+105.031209018" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.223757 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.223857 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.223973 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.223996 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.224162 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 
09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.224891 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.224952 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.225093 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.229698 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: \"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.242862 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/612b4d3e-5daf-415e-b032-4b4f97b3cbfe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6xk24\" (UID: 
\"612b4d3e-5daf-415e-b032-4b4f97b3cbfe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.323380 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.452160 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.452227 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:41:00 crc kubenswrapper[4776]: E1204 09:41:00.452626 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:41:00 crc kubenswrapper[4776]: E1204 09:41:00.452728 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:41:00 crc kubenswrapper[4776]: I1204 09:41:00.452254 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:41:00 crc kubenswrapper[4776]: E1204 09:41:00.452825 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:41:01 crc kubenswrapper[4776]: I1204 09:41:01.067006 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" event={"ID":"612b4d3e-5daf-415e-b032-4b4f97b3cbfe","Type":"ContainerStarted","Data":"4dfb5123e4b204ba09910d5b20c9576f788b86be0cb566b6e92683ca294c95b2"} Dec 04 09:41:01 crc kubenswrapper[4776]: I1204 09:41:01.067067 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" event={"ID":"612b4d3e-5daf-415e-b032-4b4f97b3cbfe","Type":"ContainerStarted","Data":"9c3e6d3463fa8d31a69e0d3a2597a9dea305971e02b73ea40c3facc4d7de533b"} Dec 04 09:41:01 crc kubenswrapper[4776]: I1204 09:41:01.081178 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6xk24" podStartSLOduration=86.081152896 podStartE2EDuration="1m26.081152896s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:01.080575098 +0000 UTC m=+105.947055495" watchObservedRunningTime="2025-12-04 09:41:01.081152896 +0000 UTC m=+105.947633283" Dec 04 09:41:01 crc kubenswrapper[4776]: I1204 09:41:01.451175 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:41:01 crc kubenswrapper[4776]: E1204 09:41:01.451487 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:41:02 crc kubenswrapper[4776]: I1204 09:41:02.451604 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:41:02 crc kubenswrapper[4776]: I1204 09:41:02.451663 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:41:02 crc kubenswrapper[4776]: E1204 09:41:02.451737 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:41:02 crc kubenswrapper[4776]: I1204 09:41:02.451755 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:41:02 crc kubenswrapper[4776]: E1204 09:41:02.451860 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:41:02 crc kubenswrapper[4776]: E1204 09:41:02.452003 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:41:03 crc kubenswrapper[4776]: I1204 09:41:03.451877 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:41:03 crc kubenswrapper[4776]: E1204 09:41:03.452071 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:41:04 crc kubenswrapper[4776]: I1204 09:41:04.451836 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:41:04 crc kubenswrapper[4776]: I1204 09:41:04.451903 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:41:04 crc kubenswrapper[4776]: E1204 09:41:04.452012 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:41:04 crc kubenswrapper[4776]: I1204 09:41:04.451836 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:41:04 crc kubenswrapper[4776]: E1204 09:41:04.452361 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:41:04 crc kubenswrapper[4776]: E1204 09:41:04.452463 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:41:05 crc kubenswrapper[4776]: I1204 09:41:05.451961 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:41:05 crc kubenswrapper[4776]: E1204 09:41:05.454084 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:41:06 crc kubenswrapper[4776]: I1204 09:41:06.452439 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:41:06 crc kubenswrapper[4776]: E1204 09:41:06.452690 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:41:06 crc kubenswrapper[4776]: I1204 09:41:06.452835 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:41:06 crc kubenswrapper[4776]: E1204 09:41:06.453162 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:41:06 crc kubenswrapper[4776]: I1204 09:41:06.453252 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:41:06 crc kubenswrapper[4776]: E1204 09:41:06.453334 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:41:07 crc kubenswrapper[4776]: I1204 09:41:07.453242 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:41:07 crc kubenswrapper[4776]: E1204 09:41:07.453417 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:41:07 crc kubenswrapper[4776]: I1204 09:41:07.454579 4776 scope.go:117] "RemoveContainer" containerID="2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e" Dec 04 09:41:07 crc kubenswrapper[4776]: E1204 09:41:07.454986 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" Dec 04 09:41:08 crc kubenswrapper[4776]: I1204 09:41:08.452210 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:41:08 crc kubenswrapper[4776]: I1204 09:41:08.452277 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:41:08 crc kubenswrapper[4776]: E1204 09:41:08.452437 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:41:08 crc kubenswrapper[4776]: I1204 09:41:08.452493 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:41:08 crc kubenswrapper[4776]: E1204 09:41:08.452646 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:41:08 crc kubenswrapper[4776]: E1204 09:41:08.452801 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:41:09 crc kubenswrapper[4776]: I1204 09:41:09.452237 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:41:09 crc kubenswrapper[4776]: E1204 09:41:09.452463 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:41:10 crc kubenswrapper[4776]: I1204 09:41:10.098970 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7xv6z_423f8d5c-40c6-4efe-935f-7a9373d6becd/kube-multus/1.log" Dec 04 09:41:10 crc kubenswrapper[4776]: I1204 09:41:10.099564 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7xv6z_423f8d5c-40c6-4efe-935f-7a9373d6becd/kube-multus/0.log" Dec 04 09:41:10 crc kubenswrapper[4776]: I1204 09:41:10.099607 4776 generic.go:334] "Generic (PLEG): container finished" podID="423f8d5c-40c6-4efe-935f-7a9373d6becd" containerID="0647b2bd40f4d09007d33837bceaec60188e87d77bae9c0f49a1d3c1a91d909c" exitCode=1 Dec 04 09:41:10 crc kubenswrapper[4776]: I1204 09:41:10.099652 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7xv6z" event={"ID":"423f8d5c-40c6-4efe-935f-7a9373d6becd","Type":"ContainerDied","Data":"0647b2bd40f4d09007d33837bceaec60188e87d77bae9c0f49a1d3c1a91d909c"} Dec 04 09:41:10 crc kubenswrapper[4776]: I1204 09:41:10.099723 4776 scope.go:117] "RemoveContainer" containerID="4ee0f9c37ed13e1936219d30216b48e16d250bf5535237ec17dc81eb4dd81653" Dec 04 09:41:10 crc kubenswrapper[4776]: I1204 09:41:10.101187 4776 scope.go:117] "RemoveContainer" containerID="0647b2bd40f4d09007d33837bceaec60188e87d77bae9c0f49a1d3c1a91d909c" Dec 04 09:41:10 crc kubenswrapper[4776]: E1204 09:41:10.101629 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-7xv6z_openshift-multus(423f8d5c-40c6-4efe-935f-7a9373d6becd)\"" pod="openshift-multus/multus-7xv6z" podUID="423f8d5c-40c6-4efe-935f-7a9373d6becd" Dec 04 09:41:10 crc kubenswrapper[4776]: I1204 09:41:10.452005 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:41:10 crc kubenswrapper[4776]: I1204 09:41:10.452110 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:41:10 crc kubenswrapper[4776]: I1204 09:41:10.452133 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:41:10 crc kubenswrapper[4776]: E1204 09:41:10.452318 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:41:10 crc kubenswrapper[4776]: E1204 09:41:10.452671 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:41:10 crc kubenswrapper[4776]: E1204 09:41:10.452954 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:11 crc kubenswrapper[4776]: I1204 09:41:11.104874 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7xv6z_423f8d5c-40c6-4efe-935f-7a9373d6becd/kube-multus/1.log"
Dec 04 09:41:11 crc kubenswrapper[4776]: I1204 09:41:11.451489 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:41:11 crc kubenswrapper[4776]: E1204 09:41:11.452000 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:41:12 crc kubenswrapper[4776]: I1204 09:41:12.451485 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:41:12 crc kubenswrapper[4776]: I1204 09:41:12.451485 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:41:12 crc kubenswrapper[4776]: E1204 09:41:12.451669 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:41:12 crc kubenswrapper[4776]: E1204 09:41:12.451802 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:41:12 crc kubenswrapper[4776]: I1204 09:41:12.451510 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:41:12 crc kubenswrapper[4776]: E1204 09:41:12.451983 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:13 crc kubenswrapper[4776]: I1204 09:41:13.452488 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:41:13 crc kubenswrapper[4776]: E1204 09:41:13.452715 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:41:14 crc kubenswrapper[4776]: I1204 09:41:14.452145 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:41:14 crc kubenswrapper[4776]: I1204 09:41:14.452225 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:41:14 crc kubenswrapper[4776]: I1204 09:41:14.452398 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:41:14 crc kubenswrapper[4776]: E1204 09:41:14.452308 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:41:14 crc kubenswrapper[4776]: E1204 09:41:14.452647 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:41:14 crc kubenswrapper[4776]: E1204 09:41:14.452972 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:15 crc kubenswrapper[4776]: E1204 09:41:15.416890 4776 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Dec 04 09:41:15 crc kubenswrapper[4776]: I1204 09:41:15.451587 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:41:15 crc kubenswrapper[4776]: E1204 09:41:15.454137 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:41:15 crc kubenswrapper[4776]: E1204 09:41:15.543176 4776 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 04 09:41:16 crc kubenswrapper[4776]: I1204 09:41:16.452211 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:41:16 crc kubenswrapper[4776]: I1204 09:41:16.452211 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:41:16 crc kubenswrapper[4776]: E1204 09:41:16.452845 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:16 crc kubenswrapper[4776]: I1204 09:41:16.452265 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:41:16 crc kubenswrapper[4776]: E1204 09:41:16.453008 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:41:16 crc kubenswrapper[4776]: E1204 09:41:16.453096 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:41:17 crc kubenswrapper[4776]: I1204 09:41:17.452288 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:41:17 crc kubenswrapper[4776]: E1204 09:41:17.453084 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:41:18 crc kubenswrapper[4776]: I1204 09:41:18.452222 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:41:18 crc kubenswrapper[4776]: I1204 09:41:18.452229 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:41:18 crc kubenswrapper[4776]: E1204 09:41:18.453842 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:41:18 crc kubenswrapper[4776]: I1204 09:41:18.452240 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:41:18 crc kubenswrapper[4776]: E1204 09:41:18.454026 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:41:18 crc kubenswrapper[4776]: E1204 09:41:18.454085 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:19 crc kubenswrapper[4776]: I1204 09:41:19.452372 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:41:19 crc kubenswrapper[4776]: E1204 09:41:19.452553 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:41:20 crc kubenswrapper[4776]: I1204 09:41:20.451283 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:41:20 crc kubenswrapper[4776]: I1204 09:41:20.451347 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:41:20 crc kubenswrapper[4776]: I1204 09:41:20.451291 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:41:20 crc kubenswrapper[4776]: E1204 09:41:20.451472 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:41:20 crc kubenswrapper[4776]: E1204 09:41:20.451524 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:41:20 crc kubenswrapper[4776]: E1204 09:41:20.451597 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:20 crc kubenswrapper[4776]: E1204 09:41:20.544826 4776 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 04 09:41:21 crc kubenswrapper[4776]: I1204 09:41:21.451534 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:41:21 crc kubenswrapper[4776]: E1204 09:41:21.451720 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:41:21 crc kubenswrapper[4776]: I1204 09:41:21.452679 4776 scope.go:117] "RemoveContainer" containerID="2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e"
Dec 04 09:41:21 crc kubenswrapper[4776]: E1204 09:41:21.452933 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-q6zk4_openshift-ovn-kubernetes(fdc73cf8-973a-4254-9339-6c9f90c225bb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb"
Dec 04 09:41:22 crc kubenswrapper[4776]: I1204 09:41:22.451828 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:41:22 crc kubenswrapper[4776]: I1204 09:41:22.451844 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:41:22 crc kubenswrapper[4776]: I1204 09:41:22.451974 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:41:22 crc kubenswrapper[4776]: E1204 09:41:22.452072 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:22 crc kubenswrapper[4776]: E1204 09:41:22.452311 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:41:22 crc kubenswrapper[4776]: E1204 09:41:22.452429 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:41:23 crc kubenswrapper[4776]: I1204 09:41:23.451437 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:41:23 crc kubenswrapper[4776]: E1204 09:41:23.451701 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:41:24 crc kubenswrapper[4776]: I1204 09:41:24.452160 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:41:24 crc kubenswrapper[4776]: I1204 09:41:24.452307 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:41:24 crc kubenswrapper[4776]: E1204 09:41:24.452370 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:41:24 crc kubenswrapper[4776]: I1204 09:41:24.452400 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:41:24 crc kubenswrapper[4776]: E1204 09:41:24.452528 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:24 crc kubenswrapper[4776]: E1204 09:41:24.452834 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:41:24 crc kubenswrapper[4776]: I1204 09:41:24.453129 4776 scope.go:117] "RemoveContainer" containerID="0647b2bd40f4d09007d33837bceaec60188e87d77bae9c0f49a1d3c1a91d909c"
Dec 04 09:41:25 crc kubenswrapper[4776]: I1204 09:41:25.188700 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7xv6z_423f8d5c-40c6-4efe-935f-7a9373d6becd/kube-multus/1.log"
Dec 04 09:41:25 crc kubenswrapper[4776]: I1204 09:41:25.189024 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7xv6z" event={"ID":"423f8d5c-40c6-4efe-935f-7a9373d6becd","Type":"ContainerStarted","Data":"9eec5c6913a7c0d28163dd5e108e453b81b8bf5a5912e26cd092d47ca0f21d13"}
Dec 04 09:41:25 crc kubenswrapper[4776]: I1204 09:41:25.452076 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:41:25 crc kubenswrapper[4776]: E1204 09:41:25.453614 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:41:25 crc kubenswrapper[4776]: E1204 09:41:25.545407 4776 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 04 09:41:26 crc kubenswrapper[4776]: I1204 09:41:26.451862 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:41:26 crc kubenswrapper[4776]: I1204 09:41:26.451986 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:41:26 crc kubenswrapper[4776]: I1204 09:41:26.452013 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:41:26 crc kubenswrapper[4776]: E1204 09:41:26.452600 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:41:26 crc kubenswrapper[4776]: E1204 09:41:26.452891 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:26 crc kubenswrapper[4776]: E1204 09:41:26.453054 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:41:27 crc kubenswrapper[4776]: I1204 09:41:27.451981 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:41:27 crc kubenswrapper[4776]: E1204 09:41:27.452511 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:41:28 crc kubenswrapper[4776]: I1204 09:41:28.451720 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:41:28 crc kubenswrapper[4776]: I1204 09:41:28.451803 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:41:28 crc kubenswrapper[4776]: I1204 09:41:28.451896 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:41:28 crc kubenswrapper[4776]: E1204 09:41:28.451902 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:41:28 crc kubenswrapper[4776]: E1204 09:41:28.452068 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:28 crc kubenswrapper[4776]: E1204 09:41:28.452146 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:41:29 crc kubenswrapper[4776]: I1204 09:41:29.451756 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:41:29 crc kubenswrapper[4776]: E1204 09:41:29.451969 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:41:30 crc kubenswrapper[4776]: I1204 09:41:30.451226 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:41:30 crc kubenswrapper[4776]: I1204 09:41:30.451275 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:41:30 crc kubenswrapper[4776]: I1204 09:41:30.451275 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:41:30 crc kubenswrapper[4776]: E1204 09:41:30.451392 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:41:30 crc kubenswrapper[4776]: E1204 09:41:30.451488 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:30 crc kubenswrapper[4776]: E1204 09:41:30.451546 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:41:30 crc kubenswrapper[4776]: E1204 09:41:30.547975 4776 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 04 09:41:31 crc kubenswrapper[4776]: I1204 09:41:31.452190 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:41:31 crc kubenswrapper[4776]: E1204 09:41:31.452373 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:41:32 crc kubenswrapper[4776]: I1204 09:41:32.451699 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:41:32 crc kubenswrapper[4776]: I1204 09:41:32.451796 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:41:32 crc kubenswrapper[4776]: I1204 09:41:32.451802 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:41:32 crc kubenswrapper[4776]: E1204 09:41:32.451902 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:41:32 crc kubenswrapper[4776]: E1204 09:41:32.452084 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:32 crc kubenswrapper[4776]: E1204 09:41:32.452189 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:41:33 crc kubenswrapper[4776]: I1204 09:41:33.452331 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:41:33 crc kubenswrapper[4776]: E1204 09:41:33.453687 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:41:34 crc kubenswrapper[4776]: I1204 09:41:34.451448 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:41:34 crc kubenswrapper[4776]: I1204 09:41:34.451624 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:41:34 crc kubenswrapper[4776]: E1204 09:41:34.451668 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:41:34 crc kubenswrapper[4776]: E1204 09:41:34.451907 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:34 crc kubenswrapper[4776]: I1204 09:41:34.452159 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:41:34 crc kubenswrapper[4776]: E1204 09:41:34.452343 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:41:35 crc kubenswrapper[4776]: I1204 09:41:35.451977 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd"
Dec 04 09:41:35 crc kubenswrapper[4776]: E1204 09:41:35.453239 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d"
Dec 04 09:41:35 crc kubenswrapper[4776]: I1204 09:41:35.453476 4776 scope.go:117] "RemoveContainer" containerID="2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e"
Dec 04 09:41:35 crc kubenswrapper[4776]: E1204 09:41:35.548561 4776 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 04 09:41:36 crc kubenswrapper[4776]: I1204 09:41:36.452339 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 09:41:36 crc kubenswrapper[4776]: I1204 09:41:36.452395 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 09:41:36 crc kubenswrapper[4776]: E1204 09:41:36.452530 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 09:41:36 crc kubenswrapper[4776]: I1204 09:41:36.452664 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 09:41:36 crc kubenswrapper[4776]: E1204 09:41:36.452782 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 09:41:36 crc kubenswrapper[4776]: E1204 09:41:36.452833 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 09:41:37 crc kubenswrapper[4776]: I1204 09:41:37.236696 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/3.log"
Dec 04 09:41:37 crc kubenswrapper[4776]: I1204 09:41:37.241059 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerStarted","Data":"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101"}
Dec 04 09:41:37 crc kubenswrapper[4776]: I1204 09:41:37.242799 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4"
Dec 04 09:41:37 crc kubenswrapper[4776]: I1204 09:41:37.452120 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:41:37 crc kubenswrapper[4776]: E1204 09:41:37.452280 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:41:37 crc kubenswrapper[4776]: I1204 09:41:37.581376 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podStartSLOduration=122.581354426 podStartE2EDuration="2m2.581354426s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:37.280530958 +0000 UTC m=+142.147011345" watchObservedRunningTime="2025-12-04 09:41:37.581354426 +0000 UTC m=+142.447834803" Dec 04 09:41:37 crc kubenswrapper[4776]: I1204 09:41:37.581875 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g5jzd"] Dec 04 09:41:38 crc kubenswrapper[4776]: I1204 09:41:38.244885 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:41:38 crc kubenswrapper[4776]: E1204 09:41:38.245268 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:41:38 crc kubenswrapper[4776]: I1204 09:41:38.451955 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:41:38 crc kubenswrapper[4776]: I1204 09:41:38.452022 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:41:38 crc kubenswrapper[4776]: I1204 09:41:38.452146 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:41:38 crc kubenswrapper[4776]: E1204 09:41:38.452146 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:41:38 crc kubenswrapper[4776]: E1204 09:41:38.452316 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:41:38 crc kubenswrapper[4776]: E1204 09:41:38.452455 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.451755 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.451798 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.452057 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:41:40 crc kubenswrapper[4776]: E1204 09:41:40.452059 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g5jzd" podUID="5cca4979-0471-4a2c-97ca-b6ec6fdd935d" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.452171 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:41:40 crc kubenswrapper[4776]: E1204 09:41:40.452184 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:41:40 crc kubenswrapper[4776]: E1204 09:41:40.452294 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:41:40 crc kubenswrapper[4776]: E1204 09:41:40.452344 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.847875 4776 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.890896 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lc8p8"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.891601 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.892615 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.893258 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.899649 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-j657w"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.900845 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-vm645"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.901250 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.901941 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.907383 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.908776 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-mt567"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.907654 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.912185 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.907724 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.907859 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.908088 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.913550 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.908208 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.908999 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.909032 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.909228 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.909273 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.914326 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.909420 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.909639 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.910614 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.910614 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.910650 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.910906 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.911039 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.914822 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7kbz6"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.911216 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.911264 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.911318 4776 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.911739 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.915270 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7kbz6" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.915572 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.915839 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.926141 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.926617 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.926996 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.927551 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.927988 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-94lk2"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.929281 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.929335 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wwshr"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.930638 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.931093 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.931319 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.931419 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.931653 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.931886 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.932584 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.932871 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.933583 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.938048 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.939871 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.940035 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.951956 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.955412 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc"] Dec 04 09:41:40 crc kubenswrapper[4776]: 
I1204 09:41:40.959276 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.961232 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.961342 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vcljg"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.961475 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.961548 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.974514 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.974709 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.974791 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.987784 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.988596 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.991034 4776 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-k9crr"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.991412 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8"] Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.991536 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-k9crr" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.993611 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 04 09:41:40 crc kubenswrapper[4776]: I1204 09:41:40.996737 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.011773 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:40.998700 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zkz5f"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:40.997223 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.012697 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:40.997542 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:40.997879 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:40.998825 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:40.998906 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:40.999151 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:40.999764 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:40.999821 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:40.999845 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:40.999880 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:40.999933 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.000285 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.001267 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.001302 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.001343 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.002477 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.002545 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.012553 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qtmtj"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.002646 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.002673 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.004858 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 
09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.005046 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.005090 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.005153 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.005205 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.015117 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.011631 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/777c909d-d188-4b2d-8939-630259436b33-encryption-config\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.015533 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/777c909d-d188-4b2d-8939-630259436b33-audit-dir\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.015648 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.005655 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.005765 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.006072 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.016199 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.006524 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.015648 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.006565 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.016550 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/67de407a-108d-4477-9d23-0c60805f8ad6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-x7mp5\" (UID: \"67de407a-108d-4477-9d23-0c60805f8ad6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.006657 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.016641 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pdn2\" (UniqueName: \"kubernetes.io/projected/ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5-kube-api-access-7pdn2\") pod \"cluster-image-registry-operator-dc59b4c8b-mwg2x\" (UID: \"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.016670 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mwg2x\" (UID: \"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.016703 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cf14439-b34b-4036-bdb0-a9197b92d3d5-serving-cert\") pod \"route-controller-manager-6576b87f9c-g96md\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.016742 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/777c909d-d188-4b2d-8939-630259436b33-etcd-client\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.008368 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.016771 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-serving-cert\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017047 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/de616b95-4db7-46d2-99bd-1f9cabddcb71-stats-auth\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017084 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017128 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/777c909d-d188-4b2d-8939-630259436b33-trusted-ca-bundle\") pod 
\"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017160 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d0e41e-1a64-4157-b877-917b50f218a2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hv2mg\" (UID: \"75d0e41e-1a64-4157-b877-917b50f218a2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.009488 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.015564 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017397 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89d737e7-e467-44ab-a18a-8e41e194e982-serving-cert\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.009752 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.011056 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017583 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017634 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017662 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/de616b95-4db7-46d2-99bd-1f9cabddcb71-default-certificate\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017680 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89d737e7-e467-44ab-a18a-8e41e194e982-config\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017722 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlkft\" (UniqueName: \"kubernetes.io/projected/34aa75c9-39fc-49eb-b338-d2b1a36535a8-kube-api-access-nlkft\") pod \"machine-api-operator-5694c8668f-lc8p8\" (UID: 
\"34aa75c9-39fc-49eb-b338-d2b1a36535a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017744 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017765 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef120503-a5bc-4bde-a9ac-b461f4961766-trusted-ca\") pod \"console-operator-58897d9998-j657w\" (UID: \"ef120503-a5bc-4bde-a9ac-b461f4961766\") " pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017786 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017800 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017828 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw9c9\" (UniqueName: \"kubernetes.io/projected/730c1b4c-9f80-40aa-bf4e-d6b519be241c-kube-api-access-cw9c9\") pod \"dns-operator-744455d44c-7kbz6\" (UID: 
\"730c1b4c-9f80-40aa-bf4e-d6b519be241c\") " pod="openshift-dns-operator/dns-operator-744455d44c-7kbz6" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017845 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5538449a-79a6-4f7b-aff4-4beea5729aa1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vlj8w\" (UID: \"5538449a-79a6-4f7b-aff4-4beea5729aa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017870 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-config\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017909 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de616b95-4db7-46d2-99bd-1f9cabddcb71-metrics-certs\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017948 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfl92\" (UniqueName: \"kubernetes.io/projected/89d737e7-e467-44ab-a18a-8e41e194e982-kube-api-access-vfl92\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.017985 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/34aa75c9-39fc-49eb-b338-d2b1a36535a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lc8p8\" (UID: \"34aa75c9-39fc-49eb-b338-d2b1a36535a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.018201 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.018400 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.018565 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kqzwm"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.018802 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.018942 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.019040 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.019204 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lc8p8"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.019220 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.019205 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/777c909d-d188-4b2d-8939-630259436b33-audit-policies\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.027390 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hnv9z"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.032424 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.032581 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033023 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h4f6\" (UniqueName: \"kubernetes.io/projected/777c909d-d188-4b2d-8939-630259436b33-kube-api-access-8h4f6\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033083 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tlq4\" (UniqueName: \"kubernetes.io/projected/67de407a-108d-4477-9d23-0c60805f8ad6-kube-api-access-8tlq4\") pod \"openshift-config-operator-7777fb866f-x7mp5\" (UID: \"67de407a-108d-4477-9d23-0c60805f8ad6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033141 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5cgr\" (UniqueName: 
\"kubernetes.io/projected/de616b95-4db7-46d2-99bd-1f9cabddcb71-kube-api-access-m5cgr\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033202 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mwg2x\" (UID: \"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033366 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-trusted-ca-bundle\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033426 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34aa75c9-39fc-49eb-b338-d2b1a36535a8-config\") pod \"machine-api-operator-5694c8668f-lc8p8\" (UID: \"34aa75c9-39fc-49eb-b338-d2b1a36535a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033468 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf14439-b34b-4036-bdb0-a9197b92d3d5-config\") pod \"route-controller-manager-6576b87f9c-g96md\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 
04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033552 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhsqb\" (UniqueName: \"kubernetes.io/projected/75d0e41e-1a64-4157-b877-917b50f218a2-kube-api-access-bhsqb\") pod \"openshift-apiserver-operator-796bbdcf4f-hv2mg\" (UID: \"75d0e41e-1a64-4157-b877-917b50f218a2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033597 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-oauth-config\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033689 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033744 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-audit-policies\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033832 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/89d737e7-e467-44ab-a18a-8e41e194e982-service-ca-bundle\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033854 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/777c909d-d188-4b2d-8939-630259436b33-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033946 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89d737e7-e467-44ab-a18a-8e41e194e982-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.033971 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5538449a-79a6-4f7b-aff4-4beea5729aa1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vlj8w\" (UID: \"5538449a-79a6-4f7b-aff4-4beea5729aa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034055 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67de407a-108d-4477-9d23-0c60805f8ad6-serving-cert\") pod \"openshift-config-operator-7777fb866f-x7mp5\" (UID: \"67de407a-108d-4477-9d23-0c60805f8ad6\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034078 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9adf6cb8-c061-4462-b8ee-aa3d945af0d3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-75562\" (UID: \"9adf6cb8-c061-4462-b8ee-aa3d945af0d3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034164 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mwg2x\" (UID: \"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034211 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc2dm\" (UniqueName: \"kubernetes.io/projected/ef120503-a5bc-4bde-a9ac-b461f4961766-kube-api-access-fc2dm\") pod \"console-operator-58897d9998-j657w\" (UID: \"ef120503-a5bc-4bde-a9ac-b461f4961766\") " pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034287 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/730c1b4c-9f80-40aa-bf4e-d6b519be241c-metrics-tls\") pod \"dns-operator-744455d44c-7kbz6\" (UID: \"730c1b4c-9f80-40aa-bf4e-d6b519be241c\") " pod="openshift-dns-operator/dns-operator-744455d44c-7kbz6" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034330 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cf14439-b34b-4036-bdb0-a9197b92d3d5-client-ca\") pod \"route-controller-manager-6576b87f9c-g96md\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034404 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vh65\" (UniqueName: \"kubernetes.io/projected/52d9a038-9fbd-4306-9e4a-00901ca865dc-kube-api-access-5vh65\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034452 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/777c909d-d188-4b2d-8939-630259436b33-serving-cert\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034525 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de616b95-4db7-46d2-99bd-1f9cabddcb71-service-ca-bundle\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034554 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/34aa75c9-39fc-49eb-b338-d2b1a36535a8-images\") pod \"machine-api-operator-5694c8668f-lc8p8\" (UID: \"34aa75c9-39fc-49eb-b338-d2b1a36535a8\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034607 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034634 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034687 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d0e41e-1a64-4157-b877-917b50f218a2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hv2mg\" (UID: \"75d0e41e-1a64-4157-b877-917b50f218a2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034711 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-service-ca\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034732 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034778 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef120503-a5bc-4bde-a9ac-b461f4961766-config\") pod \"console-operator-58897d9998-j657w\" (UID: \"ef120503-a5bc-4bde-a9ac-b461f4961766\") " pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034800 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034887 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-877g5\" (UniqueName: \"kubernetes.io/projected/7c916477-5fc5-43cc-b409-01e423c554a2-kube-api-access-877g5\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.034909 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.037415 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv"] 
Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.037806 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.038661 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.042411 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef120503-a5bc-4bde-a9ac-b461f4961766-serving-cert\") pod \"console-operator-58897d9998-j657w\" (UID: \"ef120503-a5bc-4bde-a9ac-b461f4961766\") " pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.042630 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmf6n\" (UniqueName: \"kubernetes.io/projected/9adf6cb8-c061-4462-b8ee-aa3d945af0d3-kube-api-access-hmf6n\") pod \"cluster-samples-operator-665b6dd947-75562\" (UID: \"9adf6cb8-c061-4462-b8ee-aa3d945af0d3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.042663 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz2bv\" (UniqueName: \"kubernetes.io/projected/0cf14439-b34b-4036-bdb0-a9197b92d3d5-kube-api-access-nz2bv\") pod \"route-controller-manager-6576b87f9c-g96md\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.042794 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5538449a-79a6-4f7b-aff4-4beea5729aa1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vlj8w\" (UID: \"5538449a-79a6-4f7b-aff4-4beea5729aa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.042820 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-oauth-serving-cert\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.042957 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c916477-5fc5-43cc-b409-01e423c554a2-audit-dir\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.044949 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.047255 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-smxws"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.047515 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.047950 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.063659 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.063941 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.064067 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.064316 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.064523 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.064643 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.065190 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.065501 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kqzwm" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.065801 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.066057 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.066334 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.066859 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.067886 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wt658"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.068172 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.068334 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.068446 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.068544 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.068656 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.068857 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.068934 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.069021 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.069265 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.069210 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.069067 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.069223 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.070665 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.071286 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.075030 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.075198 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.077131 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.082404 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.083433 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.083624 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.083825 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.083996 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.084087 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 
09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.084400 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dcjnl"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.084523 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.084529 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.085468 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.085906 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.086141 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcjnl" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.086283 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.095117 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.095212 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hkbvc"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.096080 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.097092 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-j657w"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.103071 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.103800 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.110377 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vm645"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.110444 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.114023 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.121003 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7kbz6"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.128619 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-94lk2"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.129936 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kqzwm"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.131462 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.132734 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.133294 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.135080 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.136909 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147321 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qtmtj"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147528 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c916477-5fc5-43cc-b409-01e423c554a2-audit-dir\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147586 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d040b600-e6e9-44e3-9e26-acdc2b4f8842-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9hjpj\" (UID: \"d040b600-e6e9-44e3-9e26-acdc2b4f8842\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147611 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pxqkt\" (UID: \"0be5eba5-d1a4-4c62-97c0-33ec1ffe839e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147650 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-secret-volume\") pod \"collect-profiles-29414010-bp48f\" (UID: \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147718 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz2bv\" (UniqueName: \"kubernetes.io/projected/0cf14439-b34b-4036-bdb0-a9197b92d3d5-kube-api-access-nz2bv\") pod \"route-controller-manager-6576b87f9c-g96md\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147747 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5538449a-79a6-4f7b-aff4-4beea5729aa1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vlj8w\" (UID: \"5538449a-79a6-4f7b-aff4-4beea5729aa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147767 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-oauth-serving-cert\") pod 
\"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147806 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e068a6af-d8aa-4828-af79-72459f1f5525-proxy-tls\") pod \"machine-config-operator-74547568cd-kk2fd\" (UID: \"e068a6af-d8aa-4828-af79-72459f1f5525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147814 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c916477-5fc5-43cc-b409-01e423c554a2-audit-dir\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147826 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2779d861-62c4-4852-a7d9-d93a4f37c673-config\") pod \"kube-controller-manager-operator-78b949d7b-xcpjt\" (UID: \"2779d861-62c4-4852-a7d9-d93a4f37c673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147847 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cb24f3e7-571e-472f-8945-f50d74e07994-profile-collector-cert\") pod \"catalog-operator-68c6474976-dgsdv\" (UID: \"cb24f3e7-571e-472f-8945-f50d74e07994\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147887 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/777c909d-d188-4b2d-8939-630259436b33-audit-dir\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147909 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-etcd-serving-ca\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.147973 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/67de407a-108d-4477-9d23-0c60805f8ad6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-x7mp5\" (UID: \"67de407a-108d-4477-9d23-0c60805f8ad6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148016 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148041 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvgdc\" (UniqueName: \"kubernetes.io/projected/7e5763d8-8cd7-458d-90cc-c86a99c7207a-kube-api-access-nvgdc\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148063 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xf9p8\" (UID: \"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148110 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e068a6af-d8aa-4828-af79-72459f1f5525-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kk2fd\" (UID: \"e068a6af-d8aa-4828-af79-72459f1f5525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148332 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/777c909d-d188-4b2d-8939-630259436b33-audit-dir\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148235 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-node-pullsecrets\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148455 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p72hv\" (UniqueName: 
\"kubernetes.io/projected/10e1f0c5-eed7-40dd-871d-aa574ed684b7-kube-api-access-p72hv\") pod \"packageserver-d55dfcdfc-lxr6t\" (UID: \"10e1f0c5-eed7-40dd-871d-aa574ed684b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148502 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dzvv\" (UniqueName: \"kubernetes.io/projected/696d9668-ce83-427c-8b8c-cb069a6c1b26-kube-api-access-2dzvv\") pod \"control-plane-machine-set-operator-78cbb6b69f-4xnmk\" (UID: \"696d9668-ce83-427c-8b8c-cb069a6c1b26\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148575 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mwg2x\" (UID: \"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148624 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba875c11-befa-4f9b-8475-4405e7c5e941-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-g7rfs\" (UID: \"ba875c11-befa-4f9b-8475-4405e7c5e941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148668 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/de616b95-4db7-46d2-99bd-1f9cabddcb71-stats-auth\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " 
pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148743 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/67de407a-108d-4477-9d23-0c60805f8ad6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-x7mp5\" (UID: \"67de407a-108d-4477-9d23-0c60805f8ad6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148843 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/777c909d-d188-4b2d-8939-630259436b33-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.148898 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxftm\" (UniqueName: \"kubernetes.io/projected/d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff-kube-api-access-rxftm\") pod \"service-ca-operator-777779d784-wt658\" (UID: \"d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149007 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvlrl\" (UniqueName: \"kubernetes.io/projected/d11245f8-3b53-4363-babf-6d47d9628e1b-kube-api-access-lvlrl\") pod \"olm-operator-6b444d44fb-97ppf\" (UID: \"d11245f8-3b53-4363-babf-6d47d9628e1b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149076 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-etcd-client\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149123 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89d737e7-e467-44ab-a18a-8e41e194e982-config\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149154 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4mhx\" (UniqueName: \"kubernetes.io/projected/2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0-kube-api-access-w4mhx\") pod \"ingress-operator-5b745b69d9-xf9p8\" (UID: \"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149197 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5x8c\" (UniqueName: \"kubernetes.io/projected/7909bf76-0bc7-49e8-8711-f7229c71b3eb-kube-api-access-b5x8c\") pod \"downloads-7954f5f757-k9crr\" (UID: \"7909bf76-0bc7-49e8-8711-f7229c71b3eb\") " pod="openshift-console/downloads-7954f5f757-k9crr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149235 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-config\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 
09:41:41.149284 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c28ea18-b69e-4407-9e39-9a743bc3131c-serving-cert\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149530 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/696d9668-ce83-427c-8b8c-cb069a6c1b26-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4xnmk\" (UID: \"696d9668-ce83-427c-8b8c-cb069a6c1b26\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149594 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149633 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e450f38-92b1-4da3-8cb6-353756403eb6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-smxws\" (UID: \"1e450f38-92b1-4da3-8cb6-353756403eb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149667 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-audit\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149699 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2779d861-62c4-4852-a7d9-d93a4f37c673-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xcpjt\" (UID: \"2779d861-62c4-4852-a7d9-d93a4f37c673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149738 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qkxf\" (UniqueName: \"kubernetes.io/projected/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-kube-api-access-8qkxf\") pod \"collect-profiles-29414010-bp48f\" (UID: \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149776 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef120503-a5bc-4bde-a9ac-b461f4961766-trusted-ca\") pod \"console-operator-58897d9998-j657w\" (UID: \"ef120503-a5bc-4bde-a9ac-b461f4961766\") " pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149815 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" 
Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.149853 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-config\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.150180 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba875c11-befa-4f9b-8475-4405e7c5e941-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-g7rfs\" (UID: \"ba875c11-befa-4f9b-8475-4405e7c5e941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.150232 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/009dc36d-7df7-43ff-b90e-db90aa95bb0b-signing-key\") pod \"service-ca-9c57cc56f-hnv9z\" (UID: \"009dc36d-7df7-43ff-b90e-db90aa95bb0b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.150267 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310d7c28-b320-4b2d-a53c-2a3e097ec4c1-config\") pod \"machine-approver-56656f9798-thfjz\" (UID: \"310d7c28-b320-4b2d-a53c-2a3e097ec4c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.150312 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/777c909d-d188-4b2d-8939-630259436b33-audit-policies\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: 
\"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.150353 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h4f6\" (UniqueName: \"kubernetes.io/projected/777c909d-d188-4b2d-8939-630259436b33-kube-api-access-8h4f6\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.150404 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tlq4\" (UniqueName: \"kubernetes.io/projected/67de407a-108d-4477-9d23-0c60805f8ad6-kube-api-access-8tlq4\") pod \"openshift-config-operator-7777fb866f-x7mp5\" (UID: \"67de407a-108d-4477-9d23-0c60805f8ad6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.150448 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mwg2x\" (UID: \"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.152398 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.153567 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-image-import-ca\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " 
pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.153663 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34aa75c9-39fc-49eb-b338-d2b1a36535a8-config\") pod \"machine-api-operator-5694c8668f-lc8p8\" (UID: \"34aa75c9-39fc-49eb-b338-d2b1a36535a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.153739 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhsqb\" (UniqueName: \"kubernetes.io/projected/75d0e41e-1a64-4157-b877-917b50f218a2-kube-api-access-bhsqb\") pod \"openshift-apiserver-operator-796bbdcf4f-hv2mg\" (UID: \"75d0e41e-1a64-4157-b877-917b50f218a2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.153786 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-oauth-config\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.153894 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-encryption-config\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.153995 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5vt4\" (UniqueName: 
\"kubernetes.io/projected/0c28ea18-b69e-4407-9e39-9a743bc3131c-kube-api-access-n5vt4\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.154080 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10e1f0c5-eed7-40dd-871d-aa574ed684b7-apiservice-cert\") pod \"packageserver-d55dfcdfc-lxr6t\" (UID: \"10e1f0c5-eed7-40dd-871d-aa574ed684b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.154239 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9adf6cb8-c061-4462-b8ee-aa3d945af0d3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-75562\" (UID: \"9adf6cb8-c061-4462-b8ee-aa3d945af0d3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.154313 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpjhq\" (UniqueName: \"kubernetes.io/projected/5f87db82-b5ae-4241-9157-006a613f8425-kube-api-access-kpjhq\") pod \"migrator-59844c95c7-dcjnl\" (UID: \"5f87db82-b5ae-4241-9157-006a613f8425\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcjnl" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.154383 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc2dm\" (UniqueName: \"kubernetes.io/projected/ef120503-a5bc-4bde-a9ac-b461f4961766-kube-api-access-fc2dm\") pod \"console-operator-58897d9998-j657w\" (UID: \"ef120503-a5bc-4bde-a9ac-b461f4961766\") " 
pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.154414 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89d737e7-e467-44ab-a18a-8e41e194e982-config\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155124 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cf14439-b34b-4036-bdb0-a9197b92d3d5-client-ca\") pod \"route-controller-manager-6576b87f9c-g96md\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155288 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-config-volume\") pod \"collect-profiles-29414010-bp48f\" (UID: \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155350 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/34aa75c9-39fc-49eb-b338-d2b1a36535a8-images\") pod \"machine-api-operator-5694c8668f-lc8p8\" (UID: \"34aa75c9-39fc-49eb-b338-d2b1a36535a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155383 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/cb24f3e7-571e-472f-8945-f50d74e07994-srv-cert\") pod \"catalog-operator-68c6474976-dgsdv\" (UID: \"cb24f3e7-571e-472f-8945-f50d74e07994\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155450 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef120503-a5bc-4bde-a9ac-b461f4961766-config\") pod \"console-operator-58897d9998-j657w\" (UID: \"ef120503-a5bc-4bde-a9ac-b461f4961766\") " pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155519 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-877g5\" (UniqueName: \"kubernetes.io/projected/7c916477-5fc5-43cc-b409-01e423c554a2-kube-api-access-877g5\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155546 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/310d7c28-b320-4b2d-a53c-2a3e097ec4c1-machine-approver-tls\") pod \"machine-approver-56656f9798-thfjz\" (UID: \"310d7c28-b320-4b2d-a53c-2a3e097ec4c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155572 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7e5763d8-8cd7-458d-90cc-c86a99c7207a-etcd-service-ca\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 
09:41:41.155599 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2779d861-62c4-4852-a7d9-d93a4f37c673-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xcpjt\" (UID: \"2779d861-62c4-4852-a7d9-d93a4f37c673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155648 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/777c909d-d188-4b2d-8939-630259436b33-encryption-config\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155669 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155699 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdn2\" (UniqueName: \"kubernetes.io/projected/ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5-kube-api-access-7pdn2\") pod \"cluster-image-registry-operator-dc59b4c8b-mwg2x\" (UID: \"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155725 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-config\") pod \"kube-apiserver-operator-766d6c64bb-pxqkt\" (UID: \"0be5eba5-d1a4-4c62-97c0-33ec1ffe839e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155751 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djbdr\" (UniqueName: \"kubernetes.io/projected/310d7c28-b320-4b2d-a53c-2a3e097ec4c1-kube-api-access-djbdr\") pod \"machine-approver-56656f9798-thfjz\" (UID: \"310d7c28-b320-4b2d-a53c-2a3e097ec4c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155776 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cf14439-b34b-4036-bdb0-a9197b92d3d5-serving-cert\") pod \"route-controller-manager-6576b87f9c-g96md\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155797 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kdmv\" (UniqueName: \"kubernetes.io/projected/e068a6af-d8aa-4828-af79-72459f1f5525-kube-api-access-8kdmv\") pod \"machine-config-operator-74547568cd-kk2fd\" (UID: \"e068a6af-d8aa-4828-af79-72459f1f5525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155823 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-serving-cert\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " 
pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155851 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e450f38-92b1-4da3-8cb6-353756403eb6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-smxws\" (UID: \"1e450f38-92b1-4da3-8cb6-353756403eb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.156621 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/34aa75c9-39fc-49eb-b338-d2b1a36535a8-images\") pod \"machine-api-operator-5694c8668f-lc8p8\" (UID: \"34aa75c9-39fc-49eb-b338-d2b1a36535a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.156851 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.156966 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34aa75c9-39fc-49eb-b338-d2b1a36535a8-config\") pod \"machine-api-operator-5694c8668f-lc8p8\" (UID: \"34aa75c9-39fc-49eb-b338-d2b1a36535a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.157594 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef120503-a5bc-4bde-a9ac-b461f4961766-config\") pod \"console-operator-58897d9998-j657w\" (UID: \"ef120503-a5bc-4bde-a9ac-b461f4961766\") " pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.158604 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cf14439-b34b-4036-bdb0-a9197b92d3d5-client-ca\") pod \"route-controller-manager-6576b87f9c-g96md\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.158927 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.159285 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5538449a-79a6-4f7b-aff4-4beea5729aa1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vlj8w\" (UID: \"5538449a-79a6-4f7b-aff4-4beea5729aa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.160444 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/777c909d-d188-4b2d-8939-630259436b33-audit-policies\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.155693 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-oauth-serving-cert\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 
09:41:41.160590 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-oauth-config\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.161057 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/777c909d-d188-4b2d-8939-630259436b33-etcd-client\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.161101 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.161138 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/009dc36d-7df7-43ff-b90e-db90aa95bb0b-signing-cabundle\") pod \"service-ca-9c57cc56f-hnv9z\" (UID: \"009dc36d-7df7-43ff-b90e-db90aa95bb0b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.161249 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d0e41e-1a64-4157-b877-917b50f218a2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hv2mg\" (UID: \"75d0e41e-1a64-4157-b877-917b50f218a2\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.161280 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.161316 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89d737e7-e467-44ab-a18a-8e41e194e982-serving-cert\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.161340 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.161525 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/777c909d-d188-4b2d-8939-630259436b33-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.161644 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/ef120503-a5bc-4bde-a9ac-b461f4961766-trusted-ca\") pod \"console-operator-58897d9998-j657w\" (UID: \"ef120503-a5bc-4bde-a9ac-b461f4961766\") " pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.161935 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mwg2x\" (UID: \"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.162186 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-config\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.162741 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mwg2x\" (UID: \"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.162813 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.162935 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt85j\" (UniqueName: \"kubernetes.io/projected/80508cf3-64e7-4f5d-848f-055be21a60d2-kube-api-access-wt85j\") pod \"multus-admission-controller-857f4d67dd-kqzwm\" (UID: \"80508cf3-64e7-4f5d-848f-055be21a60d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kqzwm" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.163185 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9adf6cb8-c061-4462-b8ee-aa3d945af0d3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-75562\" (UID: \"9adf6cb8-c061-4462-b8ee-aa3d945af0d3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.163284 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.163343 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-serving-cert\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.163482 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5763d8-8cd7-458d-90cc-c86a99c7207a-config\") pod \"etcd-operator-b45778765-qtmtj\" (UID: 
\"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.163587 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/de616b95-4db7-46d2-99bd-1f9cabddcb71-default-certificate\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.163667 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlkft\" (UniqueName: \"kubernetes.io/projected/34aa75c9-39fc-49eb-b338-d2b1a36535a8-kube-api-access-nlkft\") pod \"machine-api-operator-5694c8668f-lc8p8\" (UID: \"34aa75c9-39fc-49eb-b338-d2b1a36535a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.163716 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/10e1f0c5-eed7-40dd-871d-aa574ed684b7-tmpfs\") pod \"packageserver-d55dfcdfc-lxr6t\" (UID: \"10e1f0c5-eed7-40dd-871d-aa574ed684b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.163793 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-xf9p8\" (UID: \"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.164334 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxmj\" (UniqueName: 
\"kubernetes.io/projected/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-kube-api-access-kmxmj\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.165542 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e068a6af-d8aa-4828-af79-72459f1f5525-images\") pod \"machine-config-operator-74547568cd-kk2fd\" (UID: \"e068a6af-d8aa-4828-af79-72459f1f5525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.165628 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw9c9\" (UniqueName: \"kubernetes.io/projected/730c1b4c-9f80-40aa-bf4e-d6b519be241c-kube-api-access-cw9c9\") pod \"dns-operator-744455d44c-7kbz6\" (UID: \"730c1b4c-9f80-40aa-bf4e-d6b519be241c\") " pod="openshift-dns-operator/dns-operator-744455d44c-7kbz6" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.165665 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5538449a-79a6-4f7b-aff4-4beea5729aa1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vlj8w\" (UID: \"5538449a-79a6-4f7b-aff4-4beea5729aa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.165697 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnwml\" (UniqueName: \"kubernetes.io/projected/1e450f38-92b1-4da3-8cb6-353756403eb6-kube-api-access-jnwml\") pod \"marketplace-operator-79b997595-smxws\" (UID: \"1e450f38-92b1-4da3-8cb6-353756403eb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 
09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.165722 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e5763d8-8cd7-458d-90cc-c86a99c7207a-serving-cert\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.165764 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkkrw\" (UniqueName: \"kubernetes.io/projected/694e1451-01cf-44d2-8d11-73468a1f0db9-kube-api-access-jkkrw\") pod \"machine-config-controller-84d6567774-sdtzn\" (UID: \"694e1451-01cf-44d2-8d11-73468a1f0db9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.165798 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d11245f8-3b53-4363-babf-6d47d9628e1b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-97ppf\" (UID: \"d11245f8-3b53-4363-babf-6d47d9628e1b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.165825 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/694e1451-01cf-44d2-8d11-73468a1f0db9-proxy-tls\") pod \"machine-config-controller-84d6567774-sdtzn\" (UID: \"694e1451-01cf-44d2-8d11-73468a1f0db9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.165857 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/de616b95-4db7-46d2-99bd-1f9cabddcb71-metrics-certs\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.165878 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfl92\" (UniqueName: \"kubernetes.io/projected/89d737e7-e467-44ab-a18a-8e41e194e982-kube-api-access-vfl92\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.165905 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/34aa75c9-39fc-49eb-b338-d2b1a36535a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lc8p8\" (UID: \"34aa75c9-39fc-49eb-b338-d2b1a36535a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.165945 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5cgr\" (UniqueName: \"kubernetes.io/projected/de616b95-4db7-46d2-99bd-1f9cabddcb71-kube-api-access-m5cgr\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.165977 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-audit-dir\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.166004 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10e1f0c5-eed7-40dd-871d-aa574ed684b7-webhook-cert\") pod \"packageserver-d55dfcdfc-lxr6t\" (UID: \"10e1f0c5-eed7-40dd-871d-aa574ed684b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.166032 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-trusted-ca-bundle\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.166068 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf14439-b34b-4036-bdb0-a9197b92d3d5-config\") pod \"route-controller-manager-6576b87f9c-g96md\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.166112 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.166156 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws5tb\" (UniqueName: \"kubernetes.io/projected/009dc36d-7df7-43ff-b90e-db90aa95bb0b-kube-api-access-ws5tb\") pod \"service-ca-9c57cc56f-hnv9z\" (UID: 
\"009dc36d-7df7-43ff-b90e-db90aa95bb0b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.166423 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.166910 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-serving-cert\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.167235 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.168630 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-audit-policies\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.169065 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/777c909d-d188-4b2d-8939-630259436b33-etcd-client\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.168639 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/de616b95-4db7-46d2-99bd-1f9cabddcb71-stats-auth\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.169249 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89d737e7-e467-44ab-a18a-8e41e194e982-service-ca-bundle\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.169377 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5538449a-79a6-4f7b-aff4-4beea5729aa1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vlj8w\" (UID: \"5538449a-79a6-4f7b-aff4-4beea5729aa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.169457 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-audit-policies\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.169562 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7e5763d8-8cd7-458d-90cc-c86a99c7207a-etcd-ca\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.169678 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/694e1451-01cf-44d2-8d11-73468a1f0db9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sdtzn\" (UID: \"694e1451-01cf-44d2-8d11-73468a1f0db9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.169721 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/777c909d-d188-4b2d-8939-630259436b33-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.169893 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89d737e7-e467-44ab-a18a-8e41e194e982-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.169944 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-config\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " 
pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.169974 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pxqkt\" (UID: \"0be5eba5-d1a4-4c62-97c0-33ec1ffe839e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.170204 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67de407a-108d-4477-9d23-0c60805f8ad6-serving-cert\") pod \"openshift-config-operator-7777fb866f-x7mp5\" (UID: \"67de407a-108d-4477-9d23-0c60805f8ad6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.170257 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb9bj\" (UniqueName: \"kubernetes.io/projected/d040b600-e6e9-44e3-9e26-acdc2b4f8842-kube-api-access-kb9bj\") pod \"package-server-manager-789f6589d5-9hjpj\" (UID: \"d040b600-e6e9-44e3-9e26-acdc2b4f8842\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.170325 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/310d7c28-b320-4b2d-a53c-2a3e097ec4c1-auth-proxy-config\") pod \"machine-approver-56656f9798-thfjz\" (UID: \"310d7c28-b320-4b2d-a53c-2a3e097ec4c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.170352 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff-serving-cert\") pod \"service-ca-operator-777779d784-wt658\" (UID: \"d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.170376 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-client-ca\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.170399 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjrgh\" (UniqueName: \"kubernetes.io/projected/cb24f3e7-571e-472f-8945-f50d74e07994-kube-api-access-tjrgh\") pod \"catalog-operator-68c6474976-dgsdv\" (UID: \"cb24f3e7-571e-472f-8945-f50d74e07994\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.170451 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mwg2x\" (UID: \"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.170481 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/730c1b4c-9f80-40aa-bf4e-d6b519be241c-metrics-tls\") pod \"dns-operator-744455d44c-7kbz6\" (UID: \"730c1b4c-9f80-40aa-bf4e-d6b519be241c\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-7kbz6" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.170481 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-trusted-ca-bundle\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.170658 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/777c909d-d188-4b2d-8939-630259436b33-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.170706 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vh65\" (UniqueName: \"kubernetes.io/projected/52d9a038-9fbd-4306-9e4a-00901ca865dc-kube-api-access-5vh65\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.171054 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf14439-b34b-4036-bdb0-a9197b92d3d5-config\") pod \"route-controller-manager-6576b87f9c-g96md\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.171068 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89d737e7-e467-44ab-a18a-8e41e194e982-service-ca-bundle\") pod \"authentication-operator-69f744f599-wwshr\" (UID: 
\"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.171341 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/777c909d-d188-4b2d-8939-630259436b33-serving-cert\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.171463 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5538449a-79a6-4f7b-aff4-4beea5729aa1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vlj8w\" (UID: \"5538449a-79a6-4f7b-aff4-4beea5729aa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.171598 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de616b95-4db7-46d2-99bd-1f9cabddcb71-metrics-certs\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.171529 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de616b95-4db7-46d2-99bd-1f9cabddcb71-service-ca-bundle\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.171657 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.171697 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/34aa75c9-39fc-49eb-b338-d2b1a36535a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lc8p8\" (UID: \"34aa75c9-39fc-49eb-b338-d2b1a36535a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.171861 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89d737e7-e467-44ab-a18a-8e41e194e982-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.172322 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de616b95-4db7-46d2-99bd-1f9cabddcb71-service-ca-bundle\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.172412 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 
09:41:41.172479 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w9f2\" (UniqueName: \"kubernetes.io/projected/ba875c11-befa-4f9b-8475-4405e7c5e941-kube-api-access-8w9f2\") pod \"openshift-controller-manager-operator-756b6f6bc6-g7rfs\" (UID: \"ba875c11-befa-4f9b-8475-4405e7c5e941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.172776 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.172838 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d0e41e-1a64-4157-b877-917b50f218a2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hv2mg\" (UID: \"75d0e41e-1a64-4157-b877-917b50f218a2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.172884 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-service-ca\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.173042 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-xf9p8\" (UID: \"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.173145 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.173334 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff-config\") pod \"service-ca-operator-777779d784-wt658\" (UID: \"d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.173401 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmf6n\" (UniqueName: \"kubernetes.io/projected/9adf6cb8-c061-4462-b8ee-aa3d945af0d3-kube-api-access-hmf6n\") pod \"cluster-samples-operator-665b6dd947-75562\" (UID: \"9adf6cb8-c061-4462-b8ee-aa3d945af0d3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.173448 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e5763d8-8cd7-458d-90cc-c86a99c7207a-etcd-client\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.173476 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d11245f8-3b53-4363-babf-6d47d9628e1b-srv-cert\") pod \"olm-operator-6b444d44fb-97ppf\" (UID: \"d11245f8-3b53-4363-babf-6d47d9628e1b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.173515 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/80508cf3-64e7-4f5d-848f-055be21a60d2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kqzwm\" (UID: \"80508cf3-64e7-4f5d-848f-055be21a60d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kqzwm" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.173554 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef120503-a5bc-4bde-a9ac-b461f4961766-serving-cert\") pod \"console-operator-58897d9998-j657w\" (UID: \"ef120503-a5bc-4bde-a9ac-b461f4961766\") " pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.174044 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.174198 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d0e41e-1a64-4157-b877-917b50f218a2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hv2mg\" (UID: \"75d0e41e-1a64-4157-b877-917b50f218a2\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.175062 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-service-ca\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.175348 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67de407a-108d-4477-9d23-0c60805f8ad6-serving-cert\") pod \"openshift-config-operator-7777fb866f-x7mp5\" (UID: \"67de407a-108d-4477-9d23-0c60805f8ad6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.175964 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/777c909d-d188-4b2d-8939-630259436b33-encryption-config\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.176200 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/730c1b4c-9f80-40aa-bf4e-d6b519be241c-metrics-tls\") pod \"dns-operator-744455d44c-7kbz6\" (UID: \"730c1b4c-9f80-40aa-bf4e-d6b519be241c\") " pod="openshift-dns-operator/dns-operator-744455d44c-7kbz6" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.176207 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: 
\"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.176705 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.176875 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.176886 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.176979 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cf14439-b34b-4036-bdb0-a9197b92d3d5-serving-cert\") pod \"route-controller-manager-6576b87f9c-g96md\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.177726 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.177853 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.179076 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.179178 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/777c909d-d188-4b2d-8939-630259436b33-serving-cert\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.179221 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d0e41e-1a64-4157-b877-917b50f218a2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hv2mg\" (UID: \"75d0e41e-1a64-4157-b877-917b50f218a2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.179995 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef120503-a5bc-4bde-a9ac-b461f4961766-serving-cert\") pod 
\"console-operator-58897d9998-j657w\" (UID: \"ef120503-a5bc-4bde-a9ac-b461f4961766\") " pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.180204 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89d737e7-e467-44ab-a18a-8e41e194e982-serving-cert\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.181246 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.181609 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/de616b95-4db7-46d2-99bd-1f9cabddcb71-default-certificate\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.183025 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wt658"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.184323 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.185471 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wwshr"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.186574 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs"] Dec 04 09:41:41 crc kubenswrapper[4776]: 
I1204 09:41:41.187611 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hnv9z"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.188634 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.189749 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.190712 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.191820 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pb54k"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.193027 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.193369 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.194235 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zkz5f"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.194617 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.195305 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vcljg"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.196575 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-smxws"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.197647 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.200091 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dtptp"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.201190 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dtptp" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.201244 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zcqpr"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.202378 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.202695 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zcqpr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.203834 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.205169 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hkbvc"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.206300 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.207414 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.208615 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.209693 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-k9crr"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.211086 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dcjnl"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.212644 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zcqpr"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.213509 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.214499 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.214859 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pb54k"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.216115 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rwkcl"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.217159 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rwkcl" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.217299 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rwkcl"] Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.234481 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.254423 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.275807 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.275867 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvgdc\" (UniqueName: \"kubernetes.io/projected/7e5763d8-8cd7-458d-90cc-c86a99c7207a-kube-api-access-nvgdc\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.275887 
4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xf9p8\" (UID: \"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.275987 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-node-pullsecrets\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276007 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e068a6af-d8aa-4828-af79-72459f1f5525-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kk2fd\" (UID: \"e068a6af-d8aa-4828-af79-72459f1f5525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276030 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba875c11-befa-4f9b-8475-4405e7c5e941-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-g7rfs\" (UID: \"ba875c11-befa-4f9b-8475-4405e7c5e941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276047 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p72hv\" (UniqueName: \"kubernetes.io/projected/10e1f0c5-eed7-40dd-871d-aa574ed684b7-kube-api-access-p72hv\") pod \"packageserver-d55dfcdfc-lxr6t\" (UID: 
\"10e1f0c5-eed7-40dd-871d-aa574ed684b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276084 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dzvv\" (UniqueName: \"kubernetes.io/projected/696d9668-ce83-427c-8b8c-cb069a6c1b26-kube-api-access-2dzvv\") pod \"control-plane-machine-set-operator-78cbb6b69f-4xnmk\" (UID: \"696d9668-ce83-427c-8b8c-cb069a6c1b26\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276113 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxftm\" (UniqueName: \"kubernetes.io/projected/d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff-kube-api-access-rxftm\") pod \"service-ca-operator-777779d784-wt658\" (UID: \"d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276148 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvlrl\" (UniqueName: \"kubernetes.io/projected/d11245f8-3b53-4363-babf-6d47d9628e1b-kube-api-access-lvlrl\") pod \"olm-operator-6b444d44fb-97ppf\" (UID: \"d11245f8-3b53-4363-babf-6d47d9628e1b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276169 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-etcd-client\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276187 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-w4mhx\" (UniqueName: \"kubernetes.io/projected/2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0-kube-api-access-w4mhx\") pod \"ingress-operator-5b745b69d9-xf9p8\" (UID: \"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276224 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5x8c\" (UniqueName: \"kubernetes.io/projected/7909bf76-0bc7-49e8-8711-f7229c71b3eb-kube-api-access-b5x8c\") pod \"downloads-7954f5f757-k9crr\" (UID: \"7909bf76-0bc7-49e8-8711-f7229c71b3eb\") " pod="openshift-console/downloads-7954f5f757-k9crr" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276242 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-config\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276257 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c28ea18-b69e-4407-9e39-9a743bc3131c-serving-cert\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276272 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e450f38-92b1-4da3-8cb6-353756403eb6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-smxws\" (UID: \"1e450f38-92b1-4da3-8cb6-353756403eb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 
09:41:41.276311 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/696d9668-ce83-427c-8b8c-cb069a6c1b26-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4xnmk\" (UID: \"696d9668-ce83-427c-8b8c-cb069a6c1b26\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276333 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-audit\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276350 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2779d861-62c4-4852-a7d9-d93a4f37c673-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xcpjt\" (UID: \"2779d861-62c4-4852-a7d9-d93a4f37c673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276384 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qkxf\" (UniqueName: \"kubernetes.io/projected/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-kube-api-access-8qkxf\") pod \"collect-profiles-29414010-bp48f\" (UID: \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276402 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba875c11-befa-4f9b-8475-4405e7c5e941-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-g7rfs\" (UID: \"ba875c11-befa-4f9b-8475-4405e7c5e941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276464 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/009dc36d-7df7-43ff-b90e-db90aa95bb0b-signing-key\") pod \"service-ca-9c57cc56f-hnv9z\" (UID: \"009dc36d-7df7-43ff-b90e-db90aa95bb0b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276482 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310d7c28-b320-4b2d-a53c-2a3e097ec4c1-config\") pod \"machine-approver-56656f9798-thfjz\" (UID: \"310d7c28-b320-4b2d-a53c-2a3e097ec4c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276539 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-image-import-ca\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276581 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-encryption-config\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276597 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5vt4\" (UniqueName: 
\"kubernetes.io/projected/0c28ea18-b69e-4407-9e39-9a743bc3131c-kube-api-access-n5vt4\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276662 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10e1f0c5-eed7-40dd-871d-aa574ed684b7-apiservice-cert\") pod \"packageserver-d55dfcdfc-lxr6t\" (UID: \"10e1f0c5-eed7-40dd-871d-aa574ed684b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276684 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpjhq\" (UniqueName: \"kubernetes.io/projected/5f87db82-b5ae-4241-9157-006a613f8425-kube-api-access-kpjhq\") pod \"migrator-59844c95c7-dcjnl\" (UID: \"5f87db82-b5ae-4241-9157-006a613f8425\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcjnl" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276700 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cb24f3e7-571e-472f-8945-f50d74e07994-srv-cert\") pod \"catalog-operator-68c6474976-dgsdv\" (UID: \"cb24f3e7-571e-472f-8945-f50d74e07994\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276719 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-config-volume\") pod \"collect-profiles-29414010-bp48f\" (UID: \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276749 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/310d7c28-b320-4b2d-a53c-2a3e097ec4c1-machine-approver-tls\") pod \"machine-approver-56656f9798-thfjz\" (UID: \"310d7c28-b320-4b2d-a53c-2a3e097ec4c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276771 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7e5763d8-8cd7-458d-90cc-c86a99c7207a-etcd-service-ca\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276795 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2779d861-62c4-4852-a7d9-d93a4f37c673-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xcpjt\" (UID: \"2779d861-62c4-4852-a7d9-d93a4f37c673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276820 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-config\") pod \"kube-apiserver-operator-766d6c64bb-pxqkt\" (UID: \"0be5eba5-d1a4-4c62-97c0-33ec1ffe839e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276836 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kdmv\" (UniqueName: \"kubernetes.io/projected/e068a6af-d8aa-4828-af79-72459f1f5525-kube-api-access-8kdmv\") pod \"machine-config-operator-74547568cd-kk2fd\" (UID: 
\"e068a6af-d8aa-4828-af79-72459f1f5525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276856 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djbdr\" (UniqueName: \"kubernetes.io/projected/310d7c28-b320-4b2d-a53c-2a3e097ec4c1-kube-api-access-djbdr\") pod \"machine-approver-56656f9798-thfjz\" (UID: \"310d7c28-b320-4b2d-a53c-2a3e097ec4c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276878 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e450f38-92b1-4da3-8cb6-353756403eb6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-smxws\" (UID: \"1e450f38-92b1-4da3-8cb6-353756403eb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276899 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/009dc36d-7df7-43ff-b90e-db90aa95bb0b-signing-cabundle\") pod \"service-ca-9c57cc56f-hnv9z\" (UID: \"009dc36d-7df7-43ff-b90e-db90aa95bb0b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276937 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276958 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt85j\" 
(UniqueName: \"kubernetes.io/projected/80508cf3-64e7-4f5d-848f-055be21a60d2-kube-api-access-wt85j\") pod \"multus-admission-controller-857f4d67dd-kqzwm\" (UID: \"80508cf3-64e7-4f5d-848f-055be21a60d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kqzwm" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276975 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-serving-cert\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.276991 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5763d8-8cd7-458d-90cc-c86a99c7207a-config\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277013 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/10e1f0c5-eed7-40dd-871d-aa574ed684b7-tmpfs\") pod \"packageserver-d55dfcdfc-lxr6t\" (UID: \"10e1f0c5-eed7-40dd-871d-aa574ed684b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277030 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-xf9p8\" (UID: \"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277045 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-kmxmj\" (UniqueName: \"kubernetes.io/projected/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-kube-api-access-kmxmj\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277061 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e068a6af-d8aa-4828-af79-72459f1f5525-images\") pod \"machine-config-operator-74547568cd-kk2fd\" (UID: \"e068a6af-d8aa-4828-af79-72459f1f5525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277091 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnwml\" (UniqueName: \"kubernetes.io/projected/1e450f38-92b1-4da3-8cb6-353756403eb6-kube-api-access-jnwml\") pod \"marketplace-operator-79b997595-smxws\" (UID: \"1e450f38-92b1-4da3-8cb6-353756403eb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277109 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e5763d8-8cd7-458d-90cc-c86a99c7207a-serving-cert\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277126 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkkrw\" (UniqueName: \"kubernetes.io/projected/694e1451-01cf-44d2-8d11-73468a1f0db9-kube-api-access-jkkrw\") pod \"machine-config-controller-84d6567774-sdtzn\" (UID: \"694e1451-01cf-44d2-8d11-73468a1f0db9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" Dec 04 09:41:41 crc 
kubenswrapper[4776]: I1204 09:41:41.277144 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d11245f8-3b53-4363-babf-6d47d9628e1b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-97ppf\" (UID: \"d11245f8-3b53-4363-babf-6d47d9628e1b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277174 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/694e1451-01cf-44d2-8d11-73468a1f0db9-proxy-tls\") pod \"machine-config-controller-84d6567774-sdtzn\" (UID: \"694e1451-01cf-44d2-8d11-73468a1f0db9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277202 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-audit-dir\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277225 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10e1f0c5-eed7-40dd-871d-aa574ed684b7-webhook-cert\") pod \"packageserver-d55dfcdfc-lxr6t\" (UID: \"10e1f0c5-eed7-40dd-871d-aa574ed684b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277251 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws5tb\" (UniqueName: \"kubernetes.io/projected/009dc36d-7df7-43ff-b90e-db90aa95bb0b-kube-api-access-ws5tb\") pod \"service-ca-9c57cc56f-hnv9z\" (UID: \"009dc36d-7df7-43ff-b90e-db90aa95bb0b\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277297 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7e5763d8-8cd7-458d-90cc-c86a99c7207a-etcd-ca\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277322 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/694e1451-01cf-44d2-8d11-73468a1f0db9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sdtzn\" (UID: \"694e1451-01cf-44d2-8d11-73468a1f0db9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277343 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-config\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277383 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb9bj\" (UniqueName: \"kubernetes.io/projected/d040b600-e6e9-44e3-9e26-acdc2b4f8842-kube-api-access-kb9bj\") pod \"package-server-manager-789f6589d5-9hjpj\" (UID: \"d040b600-e6e9-44e3-9e26-acdc2b4f8842\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277414 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/310d7c28-b320-4b2d-a53c-2a3e097ec4c1-auth-proxy-config\") pod 
\"machine-approver-56656f9798-thfjz\" (UID: \"310d7c28-b320-4b2d-a53c-2a3e097ec4c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277436 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pxqkt\" (UID: \"0be5eba5-d1a4-4c62-97c0-33ec1ffe839e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277480 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff-serving-cert\") pod \"service-ca-operator-777779d784-wt658\" (UID: \"d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277496 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-client-ca\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277511 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjrgh\" (UniqueName: \"kubernetes.io/projected/cb24f3e7-571e-472f-8945-f50d74e07994-kube-api-access-tjrgh\") pod \"catalog-operator-68c6474976-dgsdv\" (UID: \"cb24f3e7-571e-472f-8945-f50d74e07994\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277536 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-8w9f2\" (UniqueName: \"kubernetes.io/projected/ba875c11-befa-4f9b-8475-4405e7c5e941-kube-api-access-8w9f2\") pod \"openshift-controller-manager-operator-756b6f6bc6-g7rfs\" (UID: \"ba875c11-befa-4f9b-8475-4405e7c5e941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277559 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff-config\") pod \"service-ca-operator-777779d784-wt658\" (UID: \"d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277574 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-xf9p8\" (UID: \"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277598 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e5763d8-8cd7-458d-90cc-c86a99c7207a-etcd-client\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277616 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d11245f8-3b53-4363-babf-6d47d9628e1b-srv-cert\") pod \"olm-operator-6b444d44fb-97ppf\" (UID: \"d11245f8-3b53-4363-babf-6d47d9628e1b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" Dec 04 09:41:41 crc 
kubenswrapper[4776]: I1204 09:41:41.277634 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/80508cf3-64e7-4f5d-848f-055be21a60d2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kqzwm\" (UID: \"80508cf3-64e7-4f5d-848f-055be21a60d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kqzwm" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277661 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d040b600-e6e9-44e3-9e26-acdc2b4f8842-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9hjpj\" (UID: \"d040b600-e6e9-44e3-9e26-acdc2b4f8842\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277738 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pxqkt\" (UID: \"0be5eba5-d1a4-4c62-97c0-33ec1ffe839e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277771 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-secret-volume\") pod \"collect-profiles-29414010-bp48f\" (UID: \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277835 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277868 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e068a6af-d8aa-4828-af79-72459f1f5525-proxy-tls\") pod \"machine-config-operator-74547568cd-kk2fd\" (UID: \"e068a6af-d8aa-4828-af79-72459f1f5525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277886 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2779d861-62c4-4852-a7d9-d93a4f37c673-config\") pod \"kube-controller-manager-operator-78b949d7b-xcpjt\" (UID: \"2779d861-62c4-4852-a7d9-d93a4f37c673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277907 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cb24f3e7-571e-472f-8945-f50d74e07994-profile-collector-cert\") pod \"catalog-operator-68c6474976-dgsdv\" (UID: \"cb24f3e7-571e-472f-8945-f50d74e07994\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.277669 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.278760 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-node-pullsecrets\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" 
Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.279271 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.279451 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-audit-dir\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.279602 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-client-ca\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.279659 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/10e1f0c5-eed7-40dd-871d-aa574ed684b7-tmpfs\") pod \"packageserver-d55dfcdfc-lxr6t\" (UID: \"10e1f0c5-eed7-40dd-871d-aa574ed684b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.279885 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e068a6af-d8aa-4828-af79-72459f1f5525-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kk2fd\" (UID: \"e068a6af-d8aa-4828-af79-72459f1f5525\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.280638 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-xf9p8\" (UID: \"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.280674 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/310d7c28-b320-4b2d-a53c-2a3e097ec4c1-auth-proxy-config\") pod \"machine-approver-56656f9798-thfjz\" (UID: \"310d7c28-b320-4b2d-a53c-2a3e097ec4c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.280795 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/694e1451-01cf-44d2-8d11-73468a1f0db9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sdtzn\" (UID: \"694e1451-01cf-44d2-8d11-73468a1f0db9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.281217 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-config\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.281752 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-config\") pod 
\"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.282148 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.282322 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-etcd-serving-ca\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.283683 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-xf9p8\" (UID: \"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.284428 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-etcd-client\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.284572 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c28ea18-b69e-4407-9e39-9a743bc3131c-serving-cert\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: 
\"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.284891 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-serving-cert\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.285077 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/310d7c28-b320-4b2d-a53c-2a3e097ec4c1-machine-approver-tls\") pod \"machine-approver-56656f9798-thfjz\" (UID: \"310d7c28-b320-4b2d-a53c-2a3e097ec4c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.286821 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-encryption-config\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.294469 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.299712 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-audit\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.314785 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.322314 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310d7c28-b320-4b2d-a53c-2a3e097ec4c1-config\") pod \"machine-approver-56656f9798-thfjz\" (UID: \"310d7c28-b320-4b2d-a53c-2a3e097ec4c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.336116 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.340846 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5763d8-8cd7-458d-90cc-c86a99c7207a-config\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.356056 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.360570 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7e5763d8-8cd7-458d-90cc-c86a99c7207a-etcd-ca\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.375676 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.395122 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 09:41:41 crc 
kubenswrapper[4776]: I1204 09:41:41.414413 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.422410 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7e5763d8-8cd7-458d-90cc-c86a99c7207a-etcd-service-ca\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.434717 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.455008 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.462325 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e5763d8-8cd7-458d-90cc-c86a99c7207a-serving-cert\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.475172 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.483138 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e5763d8-8cd7-458d-90cc-c86a99c7207a-etcd-client\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.496468 4776 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.517373 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.523099 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-image-import-ca\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.535122 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.554869 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.567152 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cb24f3e7-571e-472f-8945-f50d74e07994-srv-cert\") pod \"catalog-operator-68c6474976-dgsdv\" (UID: \"cb24f3e7-571e-472f-8945-f50d74e07994\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.575092 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.595732 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.601041 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10e1f0c5-eed7-40dd-871d-aa574ed684b7-webhook-cert\") pod \"packageserver-d55dfcdfc-lxr6t\" (UID: \"10e1f0c5-eed7-40dd-871d-aa574ed684b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.606037 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10e1f0c5-eed7-40dd-871d-aa574ed684b7-apiservice-cert\") pod \"packageserver-d55dfcdfc-lxr6t\" (UID: \"10e1f0c5-eed7-40dd-871d-aa574ed684b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.615575 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.622080 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/80508cf3-64e7-4f5d-848f-055be21a60d2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kqzwm\" (UID: \"80508cf3-64e7-4f5d-848f-055be21a60d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kqzwm" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.634951 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.655235 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.675142 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.679302 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/e068a6af-d8aa-4828-af79-72459f1f5525-images\") pod \"machine-config-operator-74547568cd-kk2fd\" (UID: \"e068a6af-d8aa-4828-af79-72459f1f5525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.694831 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.715858 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.726997 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/009dc36d-7df7-43ff-b90e-db90aa95bb0b-signing-key\") pod \"service-ca-9c57cc56f-hnv9z\" (UID: \"009dc36d-7df7-43ff-b90e-db90aa95bb0b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.734988 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.739050 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/009dc36d-7df7-43ff-b90e-db90aa95bb0b-signing-cabundle\") pod \"service-ca-9c57cc56f-hnv9z\" (UID: \"009dc36d-7df7-43ff-b90e-db90aa95bb0b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.756016 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.775618 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.783233 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/694e1451-01cf-44d2-8d11-73468a1f0db9-proxy-tls\") pod \"machine-config-controller-84d6567774-sdtzn\" (UID: \"694e1451-01cf-44d2-8d11-73468a1f0db9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.794974 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.805304 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e068a6af-d8aa-4828-af79-72459f1f5525-proxy-tls\") pod \"machine-config-operator-74547568cd-kk2fd\" (UID: \"e068a6af-d8aa-4828-af79-72459f1f5525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.814573 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.835097 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.854389 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.862673 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d11245f8-3b53-4363-babf-6d47d9628e1b-srv-cert\") pod \"olm-operator-6b444d44fb-97ppf\" (UID: \"d11245f8-3b53-4363-babf-6d47d9628e1b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" Dec 04 09:41:41 crc 
kubenswrapper[4776]: I1204 09:41:41.875167 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.895454 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.915792 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.922932 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba875c11-befa-4f9b-8475-4405e7c5e941-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-g7rfs\" (UID: \"ba875c11-befa-4f9b-8475-4405e7c5e941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.935122 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.955529 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.974513 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 09:41:41 crc kubenswrapper[4776]: I1204 09:41:41.995010 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.002109 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e450f38-92b1-4da3-8cb6-353756403eb6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-smxws\" (UID: \"1e450f38-92b1-4da3-8cb6-353756403eb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.019026 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.021947 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e450f38-92b1-4da3-8cb6-353756403eb6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-smxws\" (UID: \"1e450f38-92b1-4da3-8cb6-353756403eb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.034606 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.053711 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.072689 4776 request.go:700] Waited for 1.003318343s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dcollect-profiles-config&limit=500&resourceVersion=0 Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.074212 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.082473 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-config-volume\") pod \"collect-profiles-29414010-bp48f\" (UID: \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.094743 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.113842 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.122712 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d11245f8-3b53-4363-babf-6d47d9628e1b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-97ppf\" (UID: \"d11245f8-3b53-4363-babf-6d47d9628e1b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.122840 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cb24f3e7-571e-472f-8945-f50d74e07994-profile-collector-cert\") pod \"catalog-operator-68c6474976-dgsdv\" (UID: \"cb24f3e7-571e-472f-8945-f50d74e07994\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.124058 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-secret-volume\") pod \"collect-profiles-29414010-bp48f\" (UID: \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.134545 4776 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.155036 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.161849 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d040b600-e6e9-44e3-9e26-acdc2b4f8842-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9hjpj\" (UID: \"d040b600-e6e9-44e3-9e26-acdc2b4f8842\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.178171 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.182307 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff-serving-cert\") pod \"service-ca-operator-777779d784-wt658\" (UID: \"d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.194726 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.214503 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.219761 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff-config\") pod 
\"service-ca-operator-777779d784-wt658\" (UID: \"d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.235106 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.255488 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.265710 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/696d9668-ce83-427c-8b8c-cb069a6c1b26-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4xnmk\" (UID: \"696d9668-ce83-427c-8b8c-cb069a6c1b26\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.275866 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 04 09:41:42 crc kubenswrapper[4776]: E1204 09:41:42.278515 4776 secret.go:188] Couldn't get secret openshift-kube-apiserver-operator/kube-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 09:41:42 crc kubenswrapper[4776]: E1204 09:41:42.278635 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-serving-cert podName:0be5eba5-d1a4-4c62-97c0-33ec1ffe839e nodeName:}" failed. No retries permitted until 2025-12-04 09:41:42.778606287 +0000 UTC m=+147.645086654 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-serving-cert") pod "kube-apiserver-operator-766d6c64bb-pxqkt" (UID: "0be5eba5-d1a4-4c62-97c0-33ec1ffe839e") : failed to sync secret cache: timed out waiting for the condition Dec 04 09:41:42 crc kubenswrapper[4776]: E1204 09:41:42.281736 4776 configmap.go:193] Couldn't get configMap openshift-kube-controller-manager-operator/kube-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 04 09:41:42 crc kubenswrapper[4776]: E1204 09:41:42.281964 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2779d861-62c4-4852-a7d9-d93a4f37c673-config podName:2779d861-62c4-4852-a7d9-d93a4f37c673 nodeName:}" failed. No retries permitted until 2025-12-04 09:41:42.78193875 +0000 UTC m=+147.648419127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/2779d861-62c4-4852-a7d9-d93a4f37c673-config") pod "kube-controller-manager-operator-78b949d7b-xcpjt" (UID: "2779d861-62c4-4852-a7d9-d93a4f37c673") : failed to sync configmap cache: timed out waiting for the condition Dec 04 09:41:42 crc kubenswrapper[4776]: E1204 09:41:42.282225 4776 secret.go:188] Couldn't get secret openshift-controller-manager-operator/openshift-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 09:41:42 crc kubenswrapper[4776]: E1204 09:41:42.282380 4776 secret.go:188] Couldn't get secret openshift-kube-controller-manager-operator/kube-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 09:41:42 crc kubenswrapper[4776]: E1204 09:41:42.282471 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba875c11-befa-4f9b-8475-4405e7c5e941-serving-cert 
podName:ba875c11-befa-4f9b-8475-4405e7c5e941 nodeName:}" failed. No retries permitted until 2025-12-04 09:41:42.782443286 +0000 UTC m=+147.648923653 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ba875c11-befa-4f9b-8475-4405e7c5e941-serving-cert") pod "openshift-controller-manager-operator-756b6f6bc6-g7rfs" (UID: "ba875c11-befa-4f9b-8475-4405e7c5e941") : failed to sync secret cache: timed out waiting for the condition Dec 04 09:41:42 crc kubenswrapper[4776]: E1204 09:41:42.282383 4776 configmap.go:193] Couldn't get configMap openshift-kube-apiserver-operator/kube-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 04 09:41:42 crc kubenswrapper[4776]: E1204 09:41:42.282569 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2779d861-62c4-4852-a7d9-d93a4f37c673-serving-cert podName:2779d861-62c4-4852-a7d9-d93a4f37c673 nodeName:}" failed. No retries permitted until 2025-12-04 09:41:42.782540739 +0000 UTC m=+147.649021296 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2779d861-62c4-4852-a7d9-d93a4f37c673-serving-cert") pod "kube-controller-manager-operator-78b949d7b-xcpjt" (UID: "2779d861-62c4-4852-a7d9-d93a4f37c673") : failed to sync secret cache: timed out waiting for the condition Dec 04 09:41:42 crc kubenswrapper[4776]: E1204 09:41:42.282737 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-config podName:0be5eba5-d1a4-4c62-97c0-33ec1ffe839e nodeName:}" failed. No retries permitted until 2025-12-04 09:41:42.782715935 +0000 UTC m=+147.649196312 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-config") pod "kube-apiserver-operator-766d6c64bb-pxqkt" (UID: "0be5eba5-d1a4-4c62-97c0-33ec1ffe839e") : failed to sync configmap cache: timed out waiting for the condition Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.316209 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.335240 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.354808 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.375159 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.396019 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.415094 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.435204 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.451673 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.451766 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.451822 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.451673 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.455538 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.474971 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.495090 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.502534 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:42 crc kubenswrapper[4776]: E1204 09:41:42.502689 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:43:44.502663789 +0000 UTC m=+269.369144166 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.502995 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.503088 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.503141 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.503245 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.514092 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.535485 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.555654 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.575701 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.595422 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.615655 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.634911 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.655549 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 09:41:42 
crc kubenswrapper[4776]: I1204 09:41:42.675473 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.694644 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.731944 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhsqb\" (UniqueName: \"kubernetes.io/projected/75d0e41e-1a64-4157-b877-917b50f218a2-kube-api-access-bhsqb\") pod \"openshift-apiserver-operator-796bbdcf4f-hv2mg\" (UID: \"75d0e41e-1a64-4157-b877-917b50f218a2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.751698 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tlq4\" (UniqueName: \"kubernetes.io/projected/67de407a-108d-4477-9d23-0c60805f8ad6-kube-api-access-8tlq4\") pod \"openshift-config-operator-7777fb866f-x7mp5\" (UID: \"67de407a-108d-4477-9d23-0c60805f8ad6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.776618 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-877g5\" (UniqueName: \"kubernetes.io/projected/7c916477-5fc5-43cc-b409-01e423c554a2-kube-api-access-877g5\") pod \"oauth-openshift-558db77b4-94lk2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.790719 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc2dm\" (UniqueName: \"kubernetes.io/projected/ef120503-a5bc-4bde-a9ac-b461f4961766-kube-api-access-fc2dm\") pod \"console-operator-58897d9998-j657w\" (UID: 
\"ef120503-a5bc-4bde-a9ac-b461f4961766\") " pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.806758 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pxqkt\" (UID: \"0be5eba5-d1a4-4c62-97c0-33ec1ffe839e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.806895 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2779d861-62c4-4852-a7d9-d93a4f37c673-config\") pod \"kube-controller-manager-operator-78b949d7b-xcpjt\" (UID: \"2779d861-62c4-4852-a7d9-d93a4f37c673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.806983 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba875c11-befa-4f9b-8475-4405e7c5e941-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-g7rfs\" (UID: \"ba875c11-befa-4f9b-8475-4405e7c5e941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.807194 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2779d861-62c4-4852-a7d9-d93a4f37c673-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xcpjt\" (UID: \"2779d861-62c4-4852-a7d9-d93a4f37c673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.807230 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-config\") pod \"kube-apiserver-operator-766d6c64bb-pxqkt\" (UID: \"0be5eba5-d1a4-4c62-97c0-33ec1ffe839e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.807732 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2779d861-62c4-4852-a7d9-d93a4f37c673-config\") pod \"kube-controller-manager-operator-78b949d7b-xcpjt\" (UID: \"2779d861-62c4-4852-a7d9-d93a4f37c673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.808045 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-config\") pod \"kube-apiserver-operator-766d6c64bb-pxqkt\" (UID: \"0be5eba5-d1a4-4c62-97c0-33ec1ffe839e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.810552 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba875c11-befa-4f9b-8475-4405e7c5e941-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-g7rfs\" (UID: \"ba875c11-befa-4f9b-8475-4405e7c5e941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.810973 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pxqkt\" (UID: \"0be5eba5-d1a4-4c62-97c0-33ec1ffe839e\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.812004 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz2bv\" (UniqueName: \"kubernetes.io/projected/0cf14439-b34b-4036-bdb0-a9197b92d3d5-kube-api-access-nz2bv\") pod \"route-controller-manager-6576b87f9c-g96md\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.812365 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2779d861-62c4-4852-a7d9-d93a4f37c673-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xcpjt\" (UID: \"2779d861-62c4-4852-a7d9-d93a4f37c673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.831904 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pdn2\" (UniqueName: \"kubernetes.io/projected/ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5-kube-api-access-7pdn2\") pod \"cluster-image-registry-operator-dc59b4c8b-mwg2x\" (UID: \"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.852259 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h4f6\" (UniqueName: \"kubernetes.io/projected/777c909d-d188-4b2d-8939-630259436b33-kube-api-access-8h4f6\") pod \"apiserver-7bbb656c7d-jjmzc\" (UID: \"777c909d-d188-4b2d-8939-630259436b33\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.872864 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlkft\" 
(UniqueName: \"kubernetes.io/projected/34aa75c9-39fc-49eb-b338-d2b1a36535a8-kube-api-access-nlkft\") pod \"machine-api-operator-5694c8668f-lc8p8\" (UID: \"34aa75c9-39fc-49eb-b338-d2b1a36535a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.877280 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.892383 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfl92\" (UniqueName: \"kubernetes.io/projected/89d737e7-e467-44ab-a18a-8e41e194e982-kube-api-access-vfl92\") pod \"authentication-operator-69f744f599-wwshr\" (UID: \"89d737e7-e467-44ab-a18a-8e41e194e982\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.902381 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.911262 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw9c9\" (UniqueName: \"kubernetes.io/projected/730c1b4c-9f80-40aa-bf4e-d6b519be241c-kube-api-access-cw9c9\") pod \"dns-operator-744455d44c-7kbz6\" (UID: \"730c1b4c-9f80-40aa-bf4e-d6b519be241c\") " pod="openshift-dns-operator/dns-operator-744455d44c-7kbz6" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.921652 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.931591 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7kbz6" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.939423 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.940613 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5538449a-79a6-4f7b-aff4-4beea5729aa1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vlj8w\" (UID: \"5538449a-79a6-4f7b-aff4-4beea5729aa1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.951698 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5cgr\" (UniqueName: \"kubernetes.io/projected/de616b95-4db7-46d2-99bd-1f9cabddcb71-kube-api-access-m5cgr\") pod \"router-default-5444994796-mt567\" (UID: \"de616b95-4db7-46d2-99bd-1f9cabddcb71\") " pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.955002 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:42 crc kubenswrapper[4776]: I1204 09:41:42.974721 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vh65\" (UniqueName: \"kubernetes.io/projected/52d9a038-9fbd-4306-9e4a-00901ca865dc-kube-api-access-5vh65\") pod \"console-f9d7485db-vm645\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:42.999891 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mwg2x\" (UID: \"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.024195 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.029375 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.030883 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.034899 4776 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.035638 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmf6n\" (UniqueName: \"kubernetes.io/projected/9adf6cb8-c061-4462-b8ee-aa3d945af0d3-kube-api-access-hmf6n\") pod \"cluster-samples-operator-665b6dd947-75562\" (UID: \"9adf6cb8-c061-4462-b8ee-aa3d945af0d3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.038566 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.051562 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.055615 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.077530 4776 request.go:700] Waited for 1.883829795s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.082370 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.094278 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.105707 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.116247 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.135046 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.155749 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.175292 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.195663 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.207055 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.220653 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.237781 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.257602 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.280169 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.318169 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.340084 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djbdr\" (UniqueName: \"kubernetes.io/projected/310d7c28-b320-4b2d-a53c-2a3e097ec4c1-kube-api-access-djbdr\") pod \"machine-approver-56656f9798-thfjz\" (UID: \"310d7c28-b320-4b2d-a53c-2a3e097ec4c1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.350286 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kdmv\" (UniqueName: \"kubernetes.io/projected/e068a6af-d8aa-4828-af79-72459f1f5525-kube-api-access-8kdmv\") pod \"machine-config-operator-74547568cd-kk2fd\" (UID: \"e068a6af-d8aa-4828-af79-72459f1f5525\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.352577 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8w9f2\" (UniqueName: \"kubernetes.io/projected/ba875c11-befa-4f9b-8475-4405e7c5e941-kube-api-access-8w9f2\") pod \"openshift-controller-manager-operator-756b6f6bc6-g7rfs\" (UID: \"ba875c11-befa-4f9b-8475-4405e7c5e941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.363502 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5"] Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.372243 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.377818 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws5tb\" (UniqueName: \"kubernetes.io/projected/009dc36d-7df7-43ff-b90e-db90aa95bb0b-kube-api-access-ws5tb\") pod \"service-ca-9c57cc56f-hnv9z\" (UID: \"009dc36d-7df7-43ff-b90e-db90aa95bb0b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.404700 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt85j\" (UniqueName: \"kubernetes.io/projected/80508cf3-64e7-4f5d-848f-055be21a60d2-kube-api-access-wt85j\") pod \"multus-admission-controller-857f4d67dd-kqzwm\" (UID: \"80508cf3-64e7-4f5d-848f-055be21a60d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kqzwm" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.417726 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg"] Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.422095 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkkrw\" (UniqueName: 
\"kubernetes.io/projected/694e1451-01cf-44d2-8d11-73468a1f0db9-kube-api-access-jkkrw\") pod \"machine-config-controller-84d6567774-sdtzn\" (UID: \"694e1451-01cf-44d2-8d11-73468a1f0db9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.447748 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kqzwm" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.452057 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnwml\" (UniqueName: \"kubernetes.io/projected/1e450f38-92b1-4da3-8cb6-353756403eb6-kube-api-access-jnwml\") pod \"marketplace-operator-79b997595-smxws\" (UID: \"1e450f38-92b1-4da3-8cb6-353756403eb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.454310 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.469668 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.479716 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.484119 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.490210 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xf9p8\" (UID: \"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.493608 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:41:43 crc kubenswrapper[4776]: E1204 09:41:43.503576 4776 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Dec 04 09:41:43 crc kubenswrapper[4776]: E1204 09:41:43.503729 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:45.503694662 +0000 UTC m=+270.370175039 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Dec 04 09:41:43 crc kubenswrapper[4776]: E1204 09:41:43.504020 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 04 09:41:43 crc kubenswrapper[4776]: E1204 09:41:43.504111 4776 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 09:41:43 crc kubenswrapper[4776]: E1204 09:41:43.504169 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:45.504158555 +0000 UTC m=+270.370638932 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Dec 04 09:41:43 crc kubenswrapper[4776]: E1204 09:41:43.504217 4776 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.505832 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjrgh\" (UniqueName: \"kubernetes.io/projected/cb24f3e7-571e-472f-8945-f50d74e07994-kube-api-access-tjrgh\") pod \"catalog-operator-68c6474976-dgsdv\" (UID: \"cb24f3e7-571e-472f-8945-f50d74e07994\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.537637 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb9bj\" (UniqueName: \"kubernetes.io/projected/d040b600-e6e9-44e3-9e26-acdc2b4f8842-kube-api-access-kb9bj\") pod \"package-server-manager-789f6589d5-9hjpj\" (UID: \"d040b600-e6e9-44e3-9e26-acdc2b4f8842\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.538463 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvgdc\" (UniqueName: \"kubernetes.io/projected/7e5763d8-8cd7-458d-90cc-c86a99c7207a-kube-api-access-nvgdc\") pod \"etcd-operator-b45778765-qtmtj\" (UID: \"7e5763d8-8cd7-458d-90cc-c86a99c7207a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.538826 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kmxmj\" (UniqueName: \"kubernetes.io/projected/3b021ce5-71f1-4fac-9096-e9a6e8e820c3-kube-api-access-kmxmj\") pod \"apiserver-76f77b778f-vcljg\" (UID: \"3b021ce5-71f1-4fac-9096-e9a6e8e820c3\") " pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.566535 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0be5eba5-d1a4-4c62-97c0-33ec1ffe839e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pxqkt\" (UID: \"0be5eba5-d1a4-4c62-97c0-33ec1ffe839e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.586688 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4mhx\" (UniqueName: \"kubernetes.io/projected/2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0-kube-api-access-w4mhx\") pod \"ingress-operator-5b745b69d9-xf9p8\" (UID: \"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.596813 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5vt4\" (UniqueName: \"kubernetes.io/projected/0c28ea18-b69e-4407-9e39-9a743bc3131c-kube-api-access-n5vt4\") pod \"controller-manager-879f6c89f-zkz5f\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.614497 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2779d861-62c4-4852-a7d9-d93a4f37c673-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xcpjt\" (UID: \"2779d861-62c4-4852-a7d9-d93a4f37c673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" 
Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.641532 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5x8c\" (UniqueName: \"kubernetes.io/projected/7909bf76-0bc7-49e8-8711-f7229c71b3eb-kube-api-access-b5x8c\") pod \"downloads-7954f5f757-k9crr\" (UID: \"7909bf76-0bc7-49e8-8711-f7229c71b3eb\") " pod="openshift-console/downloads-7954f5f757-k9crr" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.648010 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-k9crr" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.656484 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.658870 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p72hv\" (UniqueName: \"kubernetes.io/projected/10e1f0c5-eed7-40dd-871d-aa574ed684b7-kube-api-access-p72hv\") pod \"packageserver-d55dfcdfc-lxr6t\" (UID: \"10e1f0c5-eed7-40dd-871d-aa574ed684b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.668569 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.681641 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.682629 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dzvv\" (UniqueName: \"kubernetes.io/projected/696d9668-ce83-427c-8b8c-cb069a6c1b26-kube-api-access-2dzvv\") pod \"control-plane-machine-set-operator-78cbb6b69f-4xnmk\" (UID: \"696d9668-ce83-427c-8b8c-cb069a6c1b26\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.708360 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.709767 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxftm\" (UniqueName: \"kubernetes.io/projected/d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff-kube-api-access-rxftm\") pod \"service-ca-operator-777779d784-wt658\" (UID: \"d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.726861 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.731360 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7kbz6"] Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.731726 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.738831 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpjhq\" (UniqueName: \"kubernetes.io/projected/5f87db82-b5ae-4241-9157-006a613f8425-kube-api-access-kpjhq\") pod \"migrator-59844c95c7-dcjnl\" (UID: \"5f87db82-b5ae-4241-9157-006a613f8425\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcjnl" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.753814 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qkxf\" (UniqueName: \"kubernetes.io/projected/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-kube-api-access-8qkxf\") pod \"collect-profiles-29414010-bp48f\" (UID: \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.760148 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvlrl\" (UniqueName: \"kubernetes.io/projected/d11245f8-3b53-4363-babf-6d47d9628e1b-kube-api-access-lvlrl\") pod \"olm-operator-6b444d44fb-97ppf\" (UID: \"d11245f8-3b53-4363-babf-6d47d9628e1b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.774507 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc"] Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.783290 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.784458 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-94lk2"] Dec 04 09:41:43 crc kubenswrapper[4776]: W1204 09:41:43.798955 4776 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod730c1b4c_9f80_40aa_bf4e_d6b519be241c.slice/crio-b3bec272c4afb7639f6591b06a02fdfd03a9f01078248d358db033c405d86284 WatchSource:0}: Error finding container b3bec272c4afb7639f6591b06a02fdfd03a9f01078248d358db033c405d86284: Status 404 returned error can't find the container with id b3bec272c4afb7639f6591b06a02fdfd03a9f01078248d358db033c405d86284 Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.799197 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md"] Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.804713 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.811206 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" Dec 04 09:41:43 crc kubenswrapper[4776]: E1204 09:41:43.814613 4776 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Dec 04 09:41:43 crc kubenswrapper[4776]: E1204 09:41:43.814765 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:45.814731604 +0000 UTC m=+270.681211981 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Dec 04 09:41:43 crc kubenswrapper[4776]: E1204 09:41:43.815556 4776 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Dec 04 09:41:43 crc kubenswrapper[4776]: E1204 09:41:43.815708 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 09:43:45.815652383 +0000 UTC m=+270.682132760 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.819451 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.819500 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.830625 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.835753 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.840746 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.847352 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcjnl" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.849736 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.858728 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.876061 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.937500 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54c3ab7c-9388-45a8-b328-83e4e841ca91-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lv9hp\" (UID: \"54c3ab7c-9388-45a8-b328-83e4e841ca91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.937541 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-socket-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.937740 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-csi-data-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.937773 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e358131f-46f1-40bc-9a4a-93798e8a303d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.937869 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-registration-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.938074 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-mountpoint-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.938255 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e358131f-46f1-40bc-9a4a-93798e8a303d-trusted-ca\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.938278 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e358131f-46f1-40bc-9a4a-93798e8a303d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.938337 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54c3ab7c-9388-45a8-b328-83e4e841ca91-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lv9hp\" (UID: \"54c3ab7c-9388-45a8-b328-83e4e841ca91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.938378 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9nmc\" (UniqueName: \"kubernetes.io/projected/54c3ab7c-9388-45a8-b328-83e4e841ca91-kube-api-access-b9nmc\") pod \"kube-storage-version-migrator-operator-b67b599dd-lv9hp\" (UID: \"54c3ab7c-9388-45a8-b328-83e4e841ca91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.938404 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.938676 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnxkp\" (UniqueName: \"kubernetes.io/projected/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-kube-api-access-nnxkp\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.938719 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-registry-tls\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.938881 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-plugins-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.938966 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-bound-sa-token\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.939036 
4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e358131f-46f1-40bc-9a4a-93798e8a303d-registry-certificates\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:43 crc kubenswrapper[4776]: I1204 09:41:43.939059 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgr2f\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-kube-api-access-bgr2f\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:43 crc kubenswrapper[4776]: E1204 09:41:43.946878 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:44.446842092 +0000 UTC m=+149.313322469 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.015645 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.038769 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.039691 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:44 crc kubenswrapper[4776]: E1204 09:41:44.040170 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:44.540124408 +0000 UTC m=+149.406605015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.040275 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e358131f-46f1-40bc-9a4a-93798e8a303d-trusted-ca\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.040390 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/321cb82e-9541-4610-9c23-23cde300fe6a-node-bootstrap-token\") pod \"machine-config-server-dtptp\" (UID: \"321cb82e-9541-4610-9c23-23cde300fe6a\") " pod="openshift-machine-config-operator/machine-config-server-dtptp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.040434 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e358131f-46f1-40bc-9a4a-93798e8a303d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.040518 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54c3ab7c-9388-45a8-b328-83e4e841ca91-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lv9hp\" (UID: \"54c3ab7c-9388-45a8-b328-83e4e841ca91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.041041 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9nmc\" (UniqueName: \"kubernetes.io/projected/54c3ab7c-9388-45a8-b328-83e4e841ca91-kube-api-access-b9nmc\") pod \"kube-storage-version-migrator-operator-b67b599dd-lv9hp\" (UID: \"54c3ab7c-9388-45a8-b328-83e4e841ca91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.041203 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: 
\"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: E1204 09:41:44.041636 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:44.541618354 +0000 UTC m=+149.408098911 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.041672 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd-config-volume\") pod \"dns-default-rwkcl\" (UID: \"bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd\") " pod="openshift-dns/dns-default-rwkcl" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.042309 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv8m6\" (UniqueName: \"kubernetes.io/projected/aa3d6c58-9a36-4ec7-b450-89f39ca4d772-kube-api-access-mv8m6\") pod \"ingress-canary-zcqpr\" (UID: \"aa3d6c58-9a36-4ec7-b450-89f39ca4d772\") " pod="openshift-ingress-canary/ingress-canary-zcqpr" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.042326 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e358131f-46f1-40bc-9a4a-93798e8a303d-trusted-ca\") pod 
\"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.042352 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa3d6c58-9a36-4ec7-b450-89f39ca4d772-cert\") pod \"ingress-canary-zcqpr\" (UID: \"aa3d6c58-9a36-4ec7-b450-89f39ca4d772\") " pod="openshift-ingress-canary/ingress-canary-zcqpr" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.043088 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksd54\" (UniqueName: \"kubernetes.io/projected/bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd-kube-api-access-ksd54\") pod \"dns-default-rwkcl\" (UID: \"bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd\") " pod="openshift-dns/dns-default-rwkcl" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.043318 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-registry-tls\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.043378 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnxkp\" (UniqueName: \"kubernetes.io/projected/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-kube-api-access-nnxkp\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.043786 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd-metrics-tls\") pod \"dns-default-rwkcl\" (UID: \"bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd\") " pod="openshift-dns/dns-default-rwkcl" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.043837 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/321cb82e-9541-4610-9c23-23cde300fe6a-certs\") pod \"machine-config-server-dtptp\" (UID: \"321cb82e-9541-4610-9c23-23cde300fe6a\") " pod="openshift-machine-config-operator/machine-config-server-dtptp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.044042 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-plugins-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.044116 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-bound-sa-token\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.044205 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e358131f-46f1-40bc-9a4a-93798e8a303d-registry-certificates\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.044232 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgr2f\" 
(UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-kube-api-access-bgr2f\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.044285 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54c3ab7c-9388-45a8-b328-83e4e841ca91-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lv9hp\" (UID: \"54c3ab7c-9388-45a8-b328-83e4e841ca91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.044305 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-socket-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.044333 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lw6x\" (UniqueName: \"kubernetes.io/projected/321cb82e-9541-4610-9c23-23cde300fe6a-kube-api-access-6lw6x\") pod \"machine-config-server-dtptp\" (UID: \"321cb82e-9541-4610-9c23-23cde300fe6a\") " pod="openshift-machine-config-operator/machine-config-server-dtptp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.044419 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-csi-data-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 
09:41:44.044438 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e358131f-46f1-40bc-9a4a-93798e8a303d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.044481 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-registration-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.044574 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-mountpoint-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.047828 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-plugins-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.048645 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-socket-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.049644 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-csi-data-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.050116 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e358131f-46f1-40bc-9a4a-93798e8a303d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.050632 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-registration-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.050908 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54c3ab7c-9388-45a8-b328-83e4e841ca91-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lv9hp\" (UID: \"54c3ab7c-9388-45a8-b328-83e4e841ca91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.051060 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-mountpoint-dir\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:44 crc kubenswrapper[4776]: 
I1204 09:41:44.057869 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54c3ab7c-9388-45a8-b328-83e4e841ca91-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lv9hp\" (UID: \"54c3ab7c-9388-45a8-b328-83e4e841ca91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.058439 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e358131f-46f1-40bc-9a4a-93798e8a303d-registry-certificates\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.062180 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wwshr"] Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.070868 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-registry-tls\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.076847 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e358131f-46f1-40bc-9a4a-93798e8a303d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.088638 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w"] Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.090088 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x"] Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.094438 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lc8p8"] Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.101815 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9nmc\" (UniqueName: \"kubernetes.io/projected/54c3ab7c-9388-45a8-b328-83e4e841ca91-kube-api-access-b9nmc\") pod \"kube-storage-version-migrator-operator-b67b599dd-lv9hp\" (UID: \"54c3ab7c-9388-45a8-b328-83e4e841ca91\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.115330 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-j657w"] Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.120022 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnxkp\" (UniqueName: \"kubernetes.io/projected/6e6e7228-bf39-47ac-ab7a-cf61cb5112c1-kube-api-access-nnxkp\") pod \"csi-hostpathplugin-pb54k\" (UID: \"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1\") " pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.131850 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-bound-sa-token\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 
09:41:44.146332 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.146863 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lw6x\" (UniqueName: \"kubernetes.io/projected/321cb82e-9541-4610-9c23-23cde300fe6a-kube-api-access-6lw6x\") pod \"machine-config-server-dtptp\" (UID: \"321cb82e-9541-4610-9c23-23cde300fe6a\") " pod="openshift-machine-config-operator/machine-config-server-dtptp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.146948 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/321cb82e-9541-4610-9c23-23cde300fe6a-node-bootstrap-token\") pod \"machine-config-server-dtptp\" (UID: \"321cb82e-9541-4610-9c23-23cde300fe6a\") " pod="openshift-machine-config-operator/machine-config-server-dtptp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.147011 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd-config-volume\") pod \"dns-default-rwkcl\" (UID: \"bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd\") " pod="openshift-dns/dns-default-rwkcl" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.147048 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv8m6\" (UniqueName: \"kubernetes.io/projected/aa3d6c58-9a36-4ec7-b450-89f39ca4d772-kube-api-access-mv8m6\") pod \"ingress-canary-zcqpr\" (UID: \"aa3d6c58-9a36-4ec7-b450-89f39ca4d772\") " pod="openshift-ingress-canary/ingress-canary-zcqpr" Dec 04 09:41:44 crc 
kubenswrapper[4776]: I1204 09:41:44.147072 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa3d6c58-9a36-4ec7-b450-89f39ca4d772-cert\") pod \"ingress-canary-zcqpr\" (UID: \"aa3d6c58-9a36-4ec7-b450-89f39ca4d772\") " pod="openshift-ingress-canary/ingress-canary-zcqpr" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.147093 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksd54\" (UniqueName: \"kubernetes.io/projected/bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd-kube-api-access-ksd54\") pod \"dns-default-rwkcl\" (UID: \"bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd\") " pod="openshift-dns/dns-default-rwkcl" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.147116 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd-metrics-tls\") pod \"dns-default-rwkcl\" (UID: \"bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd\") " pod="openshift-dns/dns-default-rwkcl" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.147133 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/321cb82e-9541-4610-9c23-23cde300fe6a-certs\") pod \"machine-config-server-dtptp\" (UID: \"321cb82e-9541-4610-9c23-23cde300fe6a\") " pod="openshift-machine-config-operator/machine-config-server-dtptp" Dec 04 09:41:44 crc kubenswrapper[4776]: E1204 09:41:44.147659 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:44.647615934 +0000 UTC m=+149.514096441 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.165323 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd-config-volume\") pod \"dns-default-rwkcl\" (UID: \"bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd\") " pod="openshift-dns/dns-default-rwkcl" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.169186 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/321cb82e-9541-4610-9c23-23cde300fe6a-certs\") pod \"machine-config-server-dtptp\" (UID: \"321cb82e-9541-4610-9c23-23cde300fe6a\") " pod="openshift-machine-config-operator/machine-config-server-dtptp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.169717 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.171674 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/321cb82e-9541-4610-9c23-23cde300fe6a-node-bootstrap-token\") pod \"machine-config-server-dtptp\" (UID: \"321cb82e-9541-4610-9c23-23cde300fe6a\") " pod="openshift-machine-config-operator/machine-config-server-dtptp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.171904 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd-metrics-tls\") pod \"dns-default-rwkcl\" (UID: \"bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd\") " pod="openshift-dns/dns-default-rwkcl" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.176591 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa3d6c58-9a36-4ec7-b450-89f39ca4d772-cert\") pod \"ingress-canary-zcqpr\" (UID: \"aa3d6c58-9a36-4ec7-b450-89f39ca4d772\") " pod="openshift-ingress-canary/ingress-canary-zcqpr" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.180660 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgr2f\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-kube-api-access-bgr2f\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.189238 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pb54k" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.204161 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv8m6\" (UniqueName: \"kubernetes.io/projected/aa3d6c58-9a36-4ec7-b450-89f39ca4d772-kube-api-access-mv8m6\") pod \"ingress-canary-zcqpr\" (UID: \"aa3d6c58-9a36-4ec7-b450-89f39ca4d772\") " pod="openshift-ingress-canary/ingress-canary-zcqpr" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.209997 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zcqpr" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.240714 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lw6x\" (UniqueName: \"kubernetes.io/projected/321cb82e-9541-4610-9c23-23cde300fe6a-kube-api-access-6lw6x\") pod \"machine-config-server-dtptp\" (UID: \"321cb82e-9541-4610-9c23-23cde300fe6a\") " pod="openshift-machine-config-operator/machine-config-server-dtptp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.248068 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: E1204 09:41:44.248432 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:44.748418543 +0000 UTC m=+149.614898920 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.273427 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksd54\" (UniqueName: \"kubernetes.io/projected/bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd-kube-api-access-ksd54\") pod \"dns-default-rwkcl\" (UID: \"bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd\") " pod="openshift-dns/dns-default-rwkcl" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.289906 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mt567" event={"ID":"de616b95-4db7-46d2-99bd-1f9cabddcb71","Type":"ContainerStarted","Data":"727b29c9360a48e4ad75c11d545d72bd99731f1cc70c3d495256e14d3f95b3d0"} Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.289998 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mt567" event={"ID":"de616b95-4db7-46d2-99bd-1f9cabddcb71","Type":"ContainerStarted","Data":"5c312114c44808b0d019b6793c86b8bd48ec237a04f920b7e16e7b0817019bb9"} Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.293328 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" event={"ID":"0cf14439-b34b-4036-bdb0-a9197b92d3d5","Type":"ContainerStarted","Data":"dc94fc05e5e89e9ea73ee06366e63c9095405a33cbc160809989e9d69ca67694"} Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.298492 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" event={"ID":"310d7c28-b320-4b2d-a53c-2a3e097ec4c1","Type":"ContainerStarted","Data":"ae1be071b70f2223ec915290b7a6b64625a04581a852e2823dacaca42ac90c03"} Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.307652 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vm645"] Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.309568 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" event={"ID":"777c909d-d188-4b2d-8939-630259436b33","Type":"ContainerStarted","Data":"c0e1e463d5ec4d8228afe0a02ca52897f51769250f844ab25a2ab7dd1bcdaf1c"} Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.320777 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" event={"ID":"75d0e41e-1a64-4157-b877-917b50f218a2","Type":"ContainerStarted","Data":"8679c6a9ccf8030f28612568a8e724eba38d8dd4d48574e5d093bb2e490a786c"} Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.320847 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" event={"ID":"75d0e41e-1a64-4157-b877-917b50f218a2","Type":"ContainerStarted","Data":"de1e6406ca6b6a3adaaa4b56b8a31be076feb3cf23324665efdd0d54a5fb5c94"} Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.322499 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" event={"ID":"7c916477-5fc5-43cc-b409-01e423c554a2","Type":"ContainerStarted","Data":"b3cf98397c74aeb0b5af0433d79a2b3ae0cad7dedcb60bf39db0c7eb55ac8581"} Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.324295 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7kbz6" 
event={"ID":"730c1b4c-9f80-40aa-bf4e-d6b519be241c","Type":"ContainerStarted","Data":"b3bec272c4afb7639f6591b06a02fdfd03a9f01078248d358db033c405d86284"} Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.330301 4776 generic.go:334] "Generic (PLEG): container finished" podID="67de407a-108d-4477-9d23-0c60805f8ad6" containerID="2e5e9812b14018e31a5eff5eabf0b7b9ce4e6c0f49cc36fd0e8c48a3fefd9b8f" exitCode=0 Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.330364 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" event={"ID":"67de407a-108d-4477-9d23-0c60805f8ad6","Type":"ContainerDied","Data":"2e5e9812b14018e31a5eff5eabf0b7b9ce4e6c0f49cc36fd0e8c48a3fefd9b8f"} Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.330402 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" event={"ID":"67de407a-108d-4477-9d23-0c60805f8ad6","Type":"ContainerStarted","Data":"947c45eaac728d264312650fbbd1b817cdf59cfe27039d6395f9da9f0224ff61"} Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.364416 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:44 crc kubenswrapper[4776]: E1204 09:41:44.364616 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:44.864579056 +0000 UTC m=+149.731059463 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.365027 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: E1204 09:41:44.365487 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:44.865468934 +0000 UTC m=+149.731949311 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.467081 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:44 crc kubenswrapper[4776]: E1204 09:41:44.471989 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:44.971968209 +0000 UTC m=+149.838448586 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:44 crc kubenswrapper[4776]: W1204 09:41:44.492969 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52d9a038_9fbd_4306_9e4a_00901ca865dc.slice/crio-1dfb36ebffe27c470511d5b2c9a5a649c52da7bb67564e7c069c5a8cb1b1bba6 WatchSource:0}: Error finding container 1dfb36ebffe27c470511d5b2c9a5a649c52da7bb67564e7c069c5a8cb1b1bba6: Status 404 returned error can't find the container with id 1dfb36ebffe27c470511d5b2c9a5a649c52da7bb67564e7c069c5a8cb1b1bba6 Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.501334 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dtptp" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.533523 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rwkcl" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.551002 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562"] Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.572276 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: E1204 09:41:44.572662 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:45.072645834 +0000 UTC m=+149.939126211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.627596 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd"] Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.673669 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:44 crc kubenswrapper[4776]: E1204 09:41:44.674275 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:45.174251917 +0000 UTC m=+150.040732294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.778489 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: E1204 09:41:44.799403 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:45.299306697 +0000 UTC m=+150.165787074 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:44 crc kubenswrapper[4776]: W1204 09:41:44.870360 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode068a6af_d8aa_4828_af79_72459f1f5525.slice/crio-c185d7ac30bfc87ce3cd852c697ec6cd4335283005b5cb4ca52159e6b34159e2 WatchSource:0}: Error finding container c185d7ac30bfc87ce3cd852c697ec6cd4335283005b5cb4ca52159e6b34159e2: Status 404 returned error can't find the container with id c185d7ac30bfc87ce3cd852c697ec6cd4335283005b5cb4ca52159e6b34159e2 Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.880148 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:44 crc kubenswrapper[4776]: E1204 09:41:44.880686 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:45.380665884 +0000 UTC m=+150.247146261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.941299 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hv2mg" podStartSLOduration=129.941271069 podStartE2EDuration="2m9.941271069s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:44.932660432 +0000 UTC m=+149.799140809" watchObservedRunningTime="2025-12-04 09:41:44.941271069 +0000 UTC m=+149.807751456" Dec 04 09:41:44 crc kubenswrapper[4776]: I1204 09:41:44.993634 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:44 crc kubenswrapper[4776]: E1204 09:41:44.994261 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:45.494246328 +0000 UTC m=+150.360726705 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.095542 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:45 crc kubenswrapper[4776]: E1204 09:41:45.096138 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:45.59611079 +0000 UTC m=+150.462591167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.108616 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-mt567" podStartSLOduration=130.108279876 podStartE2EDuration="2m10.108279876s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:45.106336056 +0000 UTC m=+149.972816433" watchObservedRunningTime="2025-12-04 09:41:45.108279876 +0000 UTC m=+149.974760253" Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.197109 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:45 crc kubenswrapper[4776]: E1204 09:41:45.197653 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:45.69762997 +0000 UTC m=+150.564110367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.217645 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.301764 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:45 crc kubenswrapper[4776]: E1204 09:41:45.302302 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:45.802276708 +0000 UTC m=+150.668757085 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.354439 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" event={"ID":"310d7c28-b320-4b2d-a53c-2a3e097ec4c1","Type":"ContainerStarted","Data":"48cac54fc46e7411742eda17936f1b9c23e4f15e1f169683db6f0851204a23cf"} Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.364600 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" event={"ID":"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5","Type":"ContainerStarted","Data":"010526104d409662a5c1f216054a0dd155179881dac546c4114c2e9458412451"} Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.388845 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dtptp" event={"ID":"321cb82e-9541-4610-9c23-23cde300fe6a","Type":"ContainerStarted","Data":"a190e86427e071c49d46dd3a45c774cacac4f10633c31743c7981e5a615467dd"} Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.397345 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" event={"ID":"5538449a-79a6-4f7b-aff4-4beea5729aa1","Type":"ContainerStarted","Data":"1332a78deceda79f2efbbf67a71080c556f9be2590bb6b04b5da57e547296774"} Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.404273 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:45 crc kubenswrapper[4776]: E1204 09:41:45.404838 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:45.9048169 +0000 UTC m=+150.771297287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.406092 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562" event={"ID":"9adf6cb8-c061-4462-b8ee-aa3d945af0d3","Type":"ContainerStarted","Data":"c6450f1b15a57d7abd5bbadb0893a1186c35e467c82e1ae6177051abfd02ff4e"} Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.421192 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vm645" event={"ID":"52d9a038-9fbd-4306-9e4a-00901ca865dc","Type":"ContainerStarted","Data":"1dfb36ebffe27c470511d5b2c9a5a649c52da7bb67564e7c069c5a8cb1b1bba6"} Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.424697 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-j657w" 
event={"ID":"ef120503-a5bc-4bde-a9ac-b461f4961766","Type":"ContainerStarted","Data":"c2c2dbf02490997634e75a8038a046c8af24b7fb94ce9d09039a5ed209c070b8"} Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.435220 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.440174 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" event={"ID":"0cf14439-b34b-4036-bdb0-a9197b92d3d5","Type":"ContainerStarted","Data":"1d7ad689071946a30cdfd8dbae9c8c0a78414277bde4b512403d2bd486d7939a"} Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.441753 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.460050 4776 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-g96md container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.460105 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" podUID="0cf14439-b34b-4036-bdb0-a9197b92d3d5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.476814 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" 
event={"ID":"e068a6af-d8aa-4828-af79-72459f1f5525","Type":"ContainerStarted","Data":"c185d7ac30bfc87ce3cd852c697ec6cd4335283005b5cb4ca52159e6b34159e2"} Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.477495 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" event={"ID":"89d737e7-e467-44ab-a18a-8e41e194e982","Type":"ContainerStarted","Data":"bd5e4f3057f6cc24b6ddbd5964708cdf9db5c90dcefe66810ad57c9c698e1211"} Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.477518 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" event={"ID":"89d737e7-e467-44ab-a18a-8e41e194e982","Type":"ContainerStarted","Data":"4186efd4c6e76515e583ad403c9004ba79817e45162a46af73b3562328197f8e"} Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.501651 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7kbz6" event={"ID":"730c1b4c-9f80-40aa-bf4e-d6b519be241c","Type":"ContainerStarted","Data":"61a062883433148a2119369c794bbfaf4f8e25db7793d11d4115d75ba552302a"} Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.505092 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:45 crc kubenswrapper[4776]: E1204 09:41:45.506551 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:46.006526437 +0000 UTC m=+150.873006814 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.517836 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" event={"ID":"34aa75c9-39fc-49eb-b338-d2b1a36535a8","Type":"ContainerStarted","Data":"4fc8dbfd19f0a5c47d55a9bf1fefb4ad7e4956beb8db963bc733b32049668695"} Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.580143 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hnv9z"] Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.608453 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:45 crc kubenswrapper[4776]: E1204 09:41:45.611381 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:46.111362271 +0000 UTC m=+150.977842648 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.701474 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn"] Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.710143 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:45 crc kubenswrapper[4776]: E1204 09:41:45.710709 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:46.210675394 +0000 UTC m=+151.077155771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.710798 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:45 crc kubenswrapper[4776]: E1204 09:41:45.711340 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:46.211321214 +0000 UTC m=+151.077801781 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.729904 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kqzwm"] Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.739145 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-smxws"] Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.753183 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs"] Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.770904 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zkz5f"] Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.797639 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:45 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:45 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:45 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.797719 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.813054 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:45 crc kubenswrapper[4776]: E1204 09:41:45.813589 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:46.313567947 +0000 UTC m=+151.180048324 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.921741 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:45 crc kubenswrapper[4776]: E1204 09:41:45.922323 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-04 09:41:46.422301692 +0000 UTC m=+151.288782069 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:45 crc kubenswrapper[4776]: I1204 09:41:45.982219 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" podStartSLOduration=130.982191874 podStartE2EDuration="2m10.982191874s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:45.98073647 +0000 UTC m=+150.847216847" watchObservedRunningTime="2025-12-04 09:41:45.982191874 +0000 UTC m=+150.848672251" Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.019131 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-k9crr"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.028710 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:46 crc kubenswrapper[4776]: E1204 09:41:46.029323 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:46.529305142 +0000 UTC m=+151.395785519 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.029283 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wwshr" podStartSLOduration=131.02924817 podStartE2EDuration="2m11.02924817s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:46.013878515 +0000 UTC m=+150.880358892" watchObservedRunningTime="2025-12-04 09:41:46.02924817 +0000 UTC m=+150.895728547" Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.047644 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wt658"] Dec 04 09:41:46 crc kubenswrapper[4776]: W1204 09:41:46.089961 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7909bf76_0bc7_49e8_8711_f7229c71b3eb.slice/crio-70d385a93f80eb6c80cabfd8f57ccd5551b2fae610731c7e91ba6bc2ba54cf0c WatchSource:0}: Error finding container 70d385a93f80eb6c80cabfd8f57ccd5551b2fae610731c7e91ba6bc2ba54cf0c: Status 404 returned error can't find the container with id 70d385a93f80eb6c80cabfd8f57ccd5551b2fae610731c7e91ba6bc2ba54cf0c Dec 04 09:41:46 crc 
kubenswrapper[4776]: I1204 09:41:46.130566 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:46 crc kubenswrapper[4776]: E1204 09:41:46.131332 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:46.631312788 +0000 UTC m=+151.497793165 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.167768 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" podStartSLOduration=131.167736315 podStartE2EDuration="2m11.167736315s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:46.162271016 +0000 UTC m=+151.028751413" watchObservedRunningTime="2025-12-04 09:41:46.167736315 +0000 UTC m=+151.034216702" Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.168138 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.197302 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.268990 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:46 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:46 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:46 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.269095 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.270785 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:46 crc kubenswrapper[4776]: E1204 09:41:46.273526 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:46.773503747 +0000 UTC m=+151.639984124 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.275125 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:46 crc kubenswrapper[4776]: E1204 09:41:46.275661 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:46.775644433 +0000 UTC m=+151.642124810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.280095 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.285994 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.290544 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dcjnl"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.293712 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.296797 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pb54k"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.302311 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qtmtj"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.303887 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.372330 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vcljg"] Dec 04 09:41:46 crc kubenswrapper[4776]: 
I1204 09:41:46.376274 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:46 crc kubenswrapper[4776]: E1204 09:41:46.376579 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:46.876560305 +0000 UTC m=+151.743040682 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.377827 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.424967 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f"] Dec 04 09:41:46 crc kubenswrapper[4776]: W1204 09:41:46.453046 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f87db82_b5ae_4241_9157_006a613f8425.slice/crio-3f52d7eb9dc0a1c6cc7ee7c88d19999e6ce387035d614bdfea07cbf34202aff4 WatchSource:0}: Error finding container 
3f52d7eb9dc0a1c6cc7ee7c88d19999e6ce387035d614bdfea07cbf34202aff4: Status 404 returned error can't find the container with id 3f52d7eb9dc0a1c6cc7ee7c88d19999e6ce387035d614bdfea07cbf34202aff4 Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.477835 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:46 crc kubenswrapper[4776]: E1204 09:41:46.478459 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:46.978440528 +0000 UTC m=+151.844920905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:46 crc kubenswrapper[4776]: W1204 09:41:46.498424 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod696d9668_ce83_427c_8b8c_cb069a6c1b26.slice/crio-119779973ee77e45e6aff8ac32b50272335b79d3f4e64cc00bea736a5abc637d WatchSource:0}: Error finding container 119779973ee77e45e6aff8ac32b50272335b79d3f4e64cc00bea736a5abc637d: Status 404 returned error can't find the container with id 119779973ee77e45e6aff8ac32b50272335b79d3f4e64cc00bea736a5abc637d Dec 04 09:41:46 crc kubenswrapper[4776]: W1204 09:41:46.525878 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2779d861_62c4_4852_a7d9_d93a4f37c673.slice/crio-cc1a96ff6cd877de5fcb3f96684d00c5acac1cd4dea90e753b593e639b26c901 WatchSource:0}: Error finding container cc1a96ff6cd877de5fcb3f96684d00c5acac1cd4dea90e753b593e639b26c901: Status 404 returned error can't find the container with id cc1a96ff6cd877de5fcb3f96684d00c5acac1cd4dea90e753b593e639b26c901 Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.526869 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rwkcl"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.542217 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zcqpr"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.547800 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.579738 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:46 crc kubenswrapper[4776]: E1204 09:41:46.580356 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:47.080335591 +0000 UTC m=+151.946815968 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.584085 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:46 crc kubenswrapper[4776]: E1204 09:41:46.585511 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:47.08549278 +0000 UTC m=+151.951973157 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.600811 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7kbz6" event={"ID":"730c1b4c-9f80-40aa-bf4e-d6b519be241c","Type":"ContainerStarted","Data":"46e7c7c5d45947b73a23b8365fb67147115a30d15300ad94dca31d39ed696d06"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.632795 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcjnl" event={"ID":"5f87db82-b5ae-4241-9157-006a613f8425","Type":"ContainerStarted","Data":"3f52d7eb9dc0a1c6cc7ee7c88d19999e6ce387035d614bdfea07cbf34202aff4"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.634509 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp"] Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.636908 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-7kbz6" podStartSLOduration=131.636820958 podStartE2EDuration="2m11.636820958s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:46.632533116 +0000 UTC m=+151.499013493" watchObservedRunningTime="2025-12-04 09:41:46.636820958 +0000 UTC m=+151.503301335" Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.643221 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" event={"ID":"e068a6af-d8aa-4828-af79-72459f1f5525","Type":"ContainerStarted","Data":"049a26defa0f583847acc717b1126f6b7dca47a29e9c75ddbec70a66e5150334"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.655730 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" event={"ID":"7c916477-5fc5-43cc-b409-01e423c554a2","Type":"ContainerStarted","Data":"e53c2dfa8444b7a1222f9af9a91cc432342bf56da1c5e6d0182f876b6394a9d4"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.656364 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.660446 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" event={"ID":"d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff","Type":"ContainerStarted","Data":"7aef8750c996a812f931d6aaf1c39f893e86161723be92a409ae90520c39dcc8"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.718767 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:46 crc kubenswrapper[4776]: E1204 09:41:46.721968 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:47.221927951 +0000 UTC m=+152.088408328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.723067 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:46 crc kubenswrapper[4776]: E1204 09:41:46.730290 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:47.230269509 +0000 UTC m=+152.096749886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.751420 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" event={"ID":"ae9eb08a-3fec-4a55-8b96-01c7b0d88ee5","Type":"ContainerStarted","Data":"3cbfd60ee063d655b6e6c92cbd4ac66c1327548db93b295b13f8ce8b5ca04944"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.766159 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" podStartSLOduration=131.766129969 podStartE2EDuration="2m11.766129969s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:46.754435207 +0000 UTC m=+151.620915604" watchObservedRunningTime="2025-12-04 09:41:46.766129969 +0000 UTC m=+151.632610346" Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.774368 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" event={"ID":"009dc36d-7df7-43ff-b90e-db90aa95bb0b","Type":"ContainerStarted","Data":"cf86a622e51dce561bf95157d082abb3f7824da1fcfb29aaa90a2702ae675413"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.788901 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" 
event={"ID":"67de407a-108d-4477-9d23-0c60805f8ad6","Type":"ContainerStarted","Data":"1db61b8e47ac994914e371a0f9d112ea018a53e7f41ddc5acc2cb3571bcc0e3b"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.791020 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mwg2x" podStartSLOduration=131.791006428 podStartE2EDuration="2m11.791006428s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:46.790246895 +0000 UTC m=+151.656727262" watchObservedRunningTime="2025-12-04 09:41:46.791006428 +0000 UTC m=+151.657486805" Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.826521 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.827142 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" podStartSLOduration=131.827114875 podStartE2EDuration="2m11.827114875s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:46.820037577 +0000 UTC m=+151.686517954" watchObservedRunningTime="2025-12-04 09:41:46.827114875 +0000 UTC m=+151.693595252" Dec 04 09:41:46 crc kubenswrapper[4776]: E1204 09:41:46.831962 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:47.331902604 +0000 UTC m=+152.198382991 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.832263 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" event={"ID":"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0","Type":"ContainerStarted","Data":"4d17671b2ebb7e1c985b09d315a4acb54b34a9d3767ba43413cdf39930a356c1"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.834607 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:46 crc kubenswrapper[4776]: E1204 09:41:46.838240 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:47.338219439 +0000 UTC m=+152.204699816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.844730 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dtptp" event={"ID":"321cb82e-9541-4610-9c23-23cde300fe6a","Type":"ContainerStarted","Data":"27da9d3284a30e1f99c8b253bd79db5866177363d4cd2191a357058385d375d5"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.847902 4776 generic.go:334] "Generic (PLEG): container finished" podID="777c909d-d188-4b2d-8939-630259436b33" containerID="e63c8f9c090d98079e62c4f75a9cafed995ae5b394e743d029dd784864ed4de6" exitCode=0 Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.848001 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" event={"ID":"777c909d-d188-4b2d-8939-630259436b33","Type":"ContainerDied","Data":"e63c8f9c090d98079e62c4f75a9cafed995ae5b394e743d029dd784864ed4de6"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.862293 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" event={"ID":"1e450f38-92b1-4da3-8cb6-353756403eb6","Type":"ContainerStarted","Data":"bf7aece4e38f4cae0b218a3d5e17b4a190a3f6596abe80b56ea81c861be8e3a8"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.886061 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" 
event={"ID":"310d7c28-b320-4b2d-a53c-2a3e097ec4c1","Type":"ContainerStarted","Data":"7d00641129176b16fe378c9373f089ee4ef58c19b1ca0431fdfcd6728ee27609"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.910391 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dtptp" podStartSLOduration=6.910368312 podStartE2EDuration="6.910368312s" podCreationTimestamp="2025-12-04 09:41:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:46.87217814 +0000 UTC m=+151.738658527" watchObservedRunningTime="2025-12-04 09:41:46.910368312 +0000 UTC m=+151.776848689" Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.940506 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.953901 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562" event={"ID":"9adf6cb8-c061-4462-b8ee-aa3d945af0d3","Type":"ContainerStarted","Data":"e511663ec8b18566988a80b5358cf38fe4ef1aba212e07d24c3670fa521d44b8"} Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.953903 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-thfjz" podStartSLOduration=131.953881348 podStartE2EDuration="2m11.953881348s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:46.953524347 +0000 UTC 
m=+151.820004724" watchObservedRunningTime="2025-12-04 09:41:46.953881348 +0000 UTC m=+151.820361725" Dec 04 09:41:46 crc kubenswrapper[4776]: E1204 09:41:46.957217 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:47.457111218 +0000 UTC m=+152.323591615 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:46 crc kubenswrapper[4776]: I1204 09:41:46.999668 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" event={"ID":"34aa75c9-39fc-49eb-b338-d2b1a36535a8","Type":"ContainerStarted","Data":"4afec2289ce8320dd1d1c3f1532fbc67b14d7b6b3e23e6ec0502985ad61cd70a"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.020822 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vcljg" event={"ID":"3b021ce5-71f1-4fac-9096-e9a6e8e820c3","Type":"ContainerStarted","Data":"bb0909d65a048d4171497f58508c43c58f9c784315ef1e44512766d130849a99"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.032868 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" podStartSLOduration=132.032843201 podStartE2EDuration="2m12.032843201s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:47.028552168 +0000 UTC m=+151.895032545" watchObservedRunningTime="2025-12-04 09:41:47.032843201 +0000 UTC m=+151.899323578" Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.043401 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:47 crc kubenswrapper[4776]: E1204 09:41:47.044155 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:47.544141121 +0000 UTC m=+152.410621498 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.050624 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" event={"ID":"ba875c11-befa-4f9b-8475-4405e7c5e941","Type":"ContainerStarted","Data":"e62b47b91d17b57a0d97171f80353cf77c7a756c4ad6d25301b98eb95ba8c374"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.056315 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-k9crr" event={"ID":"7909bf76-0bc7-49e8-8711-f7229c71b3eb","Type":"ContainerStarted","Data":"70d385a93f80eb6c80cabfd8f57ccd5551b2fae610731c7e91ba6bc2ba54cf0c"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.061097 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" event={"ID":"7e5763d8-8cd7-458d-90cc-c86a99c7207a","Type":"ContainerStarted","Data":"7ca13bffc5a5edf6827e992013be928dbbfe9188b69926aed3b8a355233b51d4"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.062554 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" event={"ID":"0be5eba5-d1a4-4c62-97c0-33ec1ffe839e","Type":"ContainerStarted","Data":"5f78c471dec550665e7bb8b5bbe789ce17f93bb2a1302366c45c95cd622df32d"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.064470 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" event={"ID":"d040b600-e6e9-44e3-9e26-acdc2b4f8842","Type":"ContainerStarted","Data":"8afc906dbcabc01096baaee4e371d1d76840a165b5e24e7e9fe5fe5ab76d6b0c"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.066522 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" event={"ID":"cb24f3e7-571e-472f-8945-f50d74e07994","Type":"ContainerStarted","Data":"a7e9646d8bc86f6b9c62d5dd3d6b3a19bc97f0a1d41042ffa8853d3cda7414ca"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.067637 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pb54k" event={"ID":"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1","Type":"ContainerStarted","Data":"fe673cea88b3d94490a053e7acbf63da9cc83a514b0a755f17275aa5fd64a76e"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.073823 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" podStartSLOduration=132.073803608 podStartE2EDuration="2m12.073803608s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:47.071779546 +0000 UTC m=+151.938259943" watchObservedRunningTime="2025-12-04 09:41:47.073803608 +0000 UTC m=+151.940283985" Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.078526 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" event={"ID":"694e1451-01cf-44d2-8d11-73468a1f0db9","Type":"ContainerStarted","Data":"4466aaee9198217805038b172846c8d7ace2cfa1b534e635c27a6b7356a161fd"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.080640 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" event={"ID":"d11245f8-3b53-4363-babf-6d47d9628e1b","Type":"ContainerStarted","Data":"f42459d10987ab140a4a3219ab326d2252c98feb3d0b9e44f52ddff2b8589b55"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.082064 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" event={"ID":"5538449a-79a6-4f7b-aff4-4beea5729aa1","Type":"ContainerStarted","Data":"56b97c36d753b6e637b6d126032ed1b2354a941f206280cc0c07696f0413f11d"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.090047 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" event={"ID":"0c28ea18-b69e-4407-9e39-9a743bc3131c","Type":"ContainerStarted","Data":"9b8c6762fd9196657ff39f7bbd6c0a001215f82475de77244c7151159d383537"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.091677 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" event={"ID":"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17","Type":"ContainerStarted","Data":"64d72fa499fcf831c8986a3963962fc4e7a4bc3e881ad4f96995a7e5b4069385"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.093261 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-j657w" event={"ID":"ef120503-a5bc-4bde-a9ac-b461f4961766","Type":"ContainerStarted","Data":"6cd9ceee23e96c6197602720779dd9286f04a9d3cb573b9c73030f4a9fc5fca0"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.093558 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.096342 4776 patch_prober.go:28] interesting pod/console-operator-58897d9998-j657w container/console-operator 
namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.096394 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-j657w" podUID="ef120503-a5bc-4bde-a9ac-b461f4961766" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.098191 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vm645" event={"ID":"52d9a038-9fbd-4306-9e4a-00901ca865dc","Type":"ContainerStarted","Data":"1cc138c576e942f4c33dc7c0293a87f966834fd88caed58c6c177af8798b04cd"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.107192 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk" event={"ID":"696d9668-ce83-427c-8b8c-cb069a6c1b26","Type":"ContainerStarted","Data":"119779973ee77e45e6aff8ac32b50272335b79d3f4e64cc00bea736a5abc637d"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.109824 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vlj8w" podStartSLOduration=132.109808902 podStartE2EDuration="2m12.109808902s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:47.103181167 +0000 UTC m=+151.969661554" watchObservedRunningTime="2025-12-04 09:41:47.109808902 +0000 UTC m=+151.976289289" Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.118607 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" event={"ID":"10e1f0c5-eed7-40dd-871d-aa574ed684b7","Type":"ContainerStarted","Data":"b053b954eb922d238d96eb0af51d6f3ebaf234ff55f7c718905204709bf87928"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.122054 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-j657w" podStartSLOduration=132.122041051 podStartE2EDuration="2m12.122041051s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:47.121486873 +0000 UTC m=+151.987967250" watchObservedRunningTime="2025-12-04 09:41:47.122041051 +0000 UTC m=+151.988521418" Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.123813 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kqzwm" event={"ID":"80508cf3-64e7-4f5d-848f-055be21a60d2","Type":"ContainerStarted","Data":"10b43b4f20d84293af656969714ead78c2bd34839e8f9ed3b275d2033747dd21"} Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.133305 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.145241 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:47 crc kubenswrapper[4776]: E1204 09:41:47.149794 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:47.649747028 +0000 UTC m=+152.516227435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.150618 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" podStartSLOduration=132.150596764 podStartE2EDuration="2m12.150596764s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:47.150531852 +0000 UTC m=+152.017012229" watchObservedRunningTime="2025-12-04 09:41:47.150596764 +0000 UTC m=+152.017077141" Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.174320 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-vm645" podStartSLOduration=132.174231875 podStartE2EDuration="2m12.174231875s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:47.172503292 +0000 UTC m=+152.038983669" watchObservedRunningTime="2025-12-04 09:41:47.174231875 +0000 UTC m=+152.040712252" Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.214867 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:47 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:47 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:47 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.215272 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.260028 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:47 crc kubenswrapper[4776]: E1204 09:41:47.266111 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:47.766087798 +0000 UTC m=+152.632568175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.323808 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.361804 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:47 crc kubenswrapper[4776]: E1204 09:41:47.362289 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:47.862250622 +0000 UTC m=+152.728730999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.463359 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:47 crc kubenswrapper[4776]: E1204 09:41:47.463846 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:47.963830576 +0000 UTC m=+152.830310953 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.564324 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:47 crc kubenswrapper[4776]: E1204 09:41:47.565011 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:48.064988715 +0000 UTC m=+152.931469092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.666783 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:47 crc kubenswrapper[4776]: E1204 09:41:47.667172 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:48.167155456 +0000 UTC m=+153.033635823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.767751 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:47 crc kubenswrapper[4776]: E1204 09:41:47.768368 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:48.268346147 +0000 UTC m=+153.134826534 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.871121 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:47 crc kubenswrapper[4776]: E1204 09:41:47.871883 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:48.37186137 +0000 UTC m=+153.238341747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.978825 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:47 crc kubenswrapper[4776]: E1204 09:41:47.979049 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:48.479011875 +0000 UTC m=+153.345492252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:47 crc kubenswrapper[4776]: I1204 09:41:47.979535 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:47 crc kubenswrapper[4776]: E1204 09:41:47.980084 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:48.480065628 +0000 UTC m=+153.346546015 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.081055 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:48 crc kubenswrapper[4776]: E1204 09:41:48.081220 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:48.581192016 +0000 UTC m=+153.447672393 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.081531 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:48 crc kubenswrapper[4776]: E1204 09:41:48.081848 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:48.581836976 +0000 UTC m=+153.448317353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.146624 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" event={"ID":"694e1451-01cf-44d2-8d11-73468a1f0db9","Type":"ContainerStarted","Data":"82b403f9e6719e95621b9482c8a4b448439f14a8d196628cb6a85e844a83bc2f"} Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.149192 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hnv9z" event={"ID":"009dc36d-7df7-43ff-b90e-db90aa95bb0b","Type":"ContainerStarted","Data":"4548a34751300a7ebb55864446d7e18134adea639186778b6788523fc4b18d90"} Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.150360 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" event={"ID":"2779d861-62c4-4852-a7d9-d93a4f37c673","Type":"ContainerStarted","Data":"cc1a96ff6cd877de5fcb3f96684d00c5acac1cd4dea90e753b593e639b26c901"} Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.151689 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" event={"ID":"1e450f38-92b1-4da3-8cb6-353756403eb6","Type":"ContainerStarted","Data":"143f2f0293a3c439f83258d2d51d12e0a8d50d98009d90d4bed6731e554750ce"} Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.151880 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.153885 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" event={"ID":"54c3ab7c-9388-45a8-b328-83e4e841ca91","Type":"ContainerStarted","Data":"d7c65b453758b5be9dd2779bab0eb82863109c49feedac23dad9f29eddbaf993"} Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.154378 4776 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-smxws container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.154450 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" podUID="1e450f38-92b1-4da3-8cb6-353756403eb6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.156271 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562" event={"ID":"9adf6cb8-c061-4462-b8ee-aa3d945af0d3","Type":"ContainerStarted","Data":"779837610ab50a1aff9ceaad7480ad0d8edf182236e74e2772886823ba5b587a"} Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.157904 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zcqpr" event={"ID":"aa3d6c58-9a36-4ec7-b450-89f39ca4d772","Type":"ContainerStarted","Data":"e17f6c35627ac93ecedc7accc95cfbabd7afef739542a129ee53401775e73a37"} Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.162957 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" event={"ID":"0c28ea18-b69e-4407-9e39-9a743bc3131c","Type":"ContainerStarted","Data":"cef1c3942b6915a5dd71c8fec98fa8d9f7aec0aad04d971215e933c8c2e17ad7"} Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.164165 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.174253 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g7rfs" event={"ID":"ba875c11-befa-4f9b-8475-4405e7c5e941","Type":"ContainerStarted","Data":"46f2db19c5bbfd0efb981065d6415f42e5f54ee48732a4e54039e0b9522530a5"} Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.174365 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.189271 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:48 crc kubenswrapper[4776]: E1204 09:41:48.189719 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:48.689682213 +0000 UTC m=+153.556162580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.194511 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lc8p8" event={"ID":"34aa75c9-39fc-49eb-b338-d2b1a36535a8","Type":"ContainerStarted","Data":"f5bad58aff6033982bf48148823ea63e3b118dec4a871e2f0b29cec245c7fa6c"} Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.208061 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rwkcl" event={"ID":"bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd","Type":"ContainerStarted","Data":"ad34d4b1ab0d68d6813e7a9f0e308281b66f9dcec025e6bdbb8c3af63042e2ac"} Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.218522 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" podStartSLOduration=133.218499114 podStartE2EDuration="2m13.218499114s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:48.21837197 +0000 UTC m=+153.084852347" watchObservedRunningTime="2025-12-04 09:41:48.218499114 +0000 UTC m=+153.084979491" Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.222029 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x7mp5" Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.222236 4776 patch_prober.go:28] interesting 
pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:48 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:48 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:48 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.222282 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.292234 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:48 crc kubenswrapper[4776]: E1204 09:41:48.292669 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:48.792651379 +0000 UTC m=+153.659131756 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.315987 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-j657w" Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.394076 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:48 crc kubenswrapper[4776]: E1204 09:41:48.394565 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:48.894541762 +0000 UTC m=+153.761022139 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.496946 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:48 crc kubenswrapper[4776]: E1204 09:41:48.498417 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:48.998397854 +0000 UTC m=+153.864878231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.599603 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:48 crc kubenswrapper[4776]: E1204 09:41:48.600038 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:49.100012638 +0000 UTC m=+153.966493015 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.707313 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:48 crc kubenswrapper[4776]: E1204 09:41:48.707768 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:49.207754002 +0000 UTC m=+154.074234379 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.814806 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:48 crc kubenswrapper[4776]: E1204 09:41:48.815895 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:49.315871187 +0000 UTC m=+154.182351564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:48 crc kubenswrapper[4776]: I1204 09:41:48.919526 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:48 crc kubenswrapper[4776]: E1204 09:41:48.921075 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:49.421056261 +0000 UTC m=+154.287536638 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.023879 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:49 crc kubenswrapper[4776]: E1204 09:41:49.024718 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:49.524694997 +0000 UTC m=+154.391175374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.126755 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:49 crc kubenswrapper[4776]: E1204 09:41:49.132218 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:49.632192774 +0000 UTC m=+154.498673151 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.213753 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:49 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:49 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:49 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.213846 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.234491 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:49 crc kubenswrapper[4776]: E1204 09:41:49.235721 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 09:41:49.735691386 +0000 UTC m=+154.602171763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.263106 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" event={"ID":"777c909d-d188-4b2d-8939-630259436b33","Type":"ContainerStarted","Data":"51682c6b78a92cb21e09bc1cbca4f1cc749b86177f4fe64b003b50a3c6365a55"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.266148 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" event={"ID":"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17","Type":"ContainerStarted","Data":"47660be036b97a021b847481665ad6a5fc760f50a090b30a8cb4fd1caa779898"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.299085 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" event={"ID":"0be5eba5-d1a4-4c62-97c0-33ec1ffe839e","Type":"ContainerStarted","Data":"66126d465a03a9c2f1b665e9492b4a142f1a575096ff9ababf61486c0449dc12"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.303144 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" event={"ID":"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0","Type":"ContainerStarted","Data":"cf82d5264ce1550f56b7c9ff036a487c544add117af7646f2d25f603ce35b912"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.308667 4776 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" podStartSLOduration=134.308624032 podStartE2EDuration="2m14.308624032s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:49.3040172 +0000 UTC m=+154.170497587" watchObservedRunningTime="2025-12-04 09:41:49.308624032 +0000 UTC m=+154.175104409" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.319061 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" event={"ID":"694e1451-01cf-44d2-8d11-73468a1f0db9","Type":"ContainerStarted","Data":"3e55df6cd3410c91f32842942e3aecae32e536b721954a9f10ebef6cb7cf6dd1"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.332064 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" event={"ID":"d1732b06-bd44-4e9a-8aa2-ed667a1ed5ff","Type":"ContainerStarted","Data":"93e15e7a663480696d867df054cf2d90dd669ee03ef00158b5c4640d687fb6ef"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.336830 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:49 crc kubenswrapper[4776]: E1204 09:41:49.337233 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 09:41:49.837218897 +0000 UTC m=+154.703699274 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.340352 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcjnl" event={"ID":"5f87db82-b5ae-4241-9157-006a613f8425","Type":"ContainerStarted","Data":"7524e1379c818e63afcc9da5832def86e6535c0ceb64277d7fef5f7e38b334ee"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.350594 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk" event={"ID":"696d9668-ce83-427c-8b8c-cb069a6c1b26","Type":"ContainerStarted","Data":"65429c038dcd9bb572c03c1d9be418638624cad7b2493f7e1bbe3c7d0f6dfa52"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.360680 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" podStartSLOduration=134.360655412 podStartE2EDuration="2m14.360655412s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:49.359749955 +0000 UTC m=+154.226230332" watchObservedRunningTime="2025-12-04 09:41:49.360655412 +0000 UTC m=+154.227135789" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.362698 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-rwkcl" event={"ID":"bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd","Type":"ContainerStarted","Data":"a57ecd156f160b0fa11684c093b3d1e8ed13f2c26f43b4b3b32aa5749f156a40"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.380206 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.380264 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.387017 4776 generic.go:334] "Generic (PLEG): container finished" podID="3b021ce5-71f1-4fac-9096-e9a6e8e820c3" containerID="0ae83d23b90726e2d275156d430045fe731c55b2a8ae74ea3ca3dc8803701fb4" exitCode=0 Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.387338 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vcljg" event={"ID":"3b021ce5-71f1-4fac-9096-e9a6e8e820c3","Type":"ContainerDied","Data":"0ae83d23b90726e2d275156d430045fe731c55b2a8ae74ea3ca3dc8803701fb4"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.413184 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" event={"ID":"d040b600-e6e9-44e3-9e26-acdc2b4f8842","Type":"ContainerStarted","Data":"531047c5a177536a6e9cd92b63f54f877ac639b3a6d52542d1fa7e402cf291a1"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.413509 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" event={"ID":"d040b600-e6e9-44e3-9e26-acdc2b4f8842","Type":"ContainerStarted","Data":"d77e4ace9f2e0fa9b5a48a9b2ac1f269e11cbd7f3fc48555278c4ae6dd1945d3"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.414539 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.443645 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:49 crc kubenswrapper[4776]: E1204 09:41:49.445602 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:49.945572329 +0000 UTC m=+154.812052866 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.447142 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" event={"ID":"2779d861-62c4-4852-a7d9-d93a4f37c673","Type":"ContainerStarted","Data":"7721b7cd77ce639c4b41a2cc7563f70d494bafbda3fdf024dddcf32396ac3420"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.469827 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pxqkt" podStartSLOduration=134.469801249 podStartE2EDuration="2m14.469801249s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:49.467545649 +0000 UTC m=+154.334026036" watchObservedRunningTime="2025-12-04 09:41:49.469801249 +0000 UTC m=+154.336281626" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.487583 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" event={"ID":"7e5763d8-8cd7-458d-90cc-c86a99c7207a","Type":"ContainerStarted","Data":"440f02cc80318362c407e4e96f900c256222d05e445167affc6212ccbe1c670a"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.491862 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" 
event={"ID":"54c3ab7c-9388-45a8-b328-83e4e841ca91","Type":"ContainerStarted","Data":"4e3aa009c6460ee4a0e93af82f44f3fb34a3b255af7d0dd50476d7fa29b56d8d"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.494846 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pb54k" event={"ID":"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1","Type":"ContainerStarted","Data":"9a283fc9a76f10fed2d700c36074a2c00f24c9003bdae0f3c71089f79d6d646e"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.496294 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kqzwm" event={"ID":"80508cf3-64e7-4f5d-848f-055be21a60d2","Type":"ContainerStarted","Data":"58a0f621daf94cdd62d174247d0ae953fa91b200d8bdeff0e9007cf51dd0551d"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.497497 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-k9crr" event={"ID":"7909bf76-0bc7-49e8-8711-f7229c71b3eb","Type":"ContainerStarted","Data":"007d7bd34a9bb72894bf3b2004e0df1a68c4adbd055c71c047f16439b55cc595"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.501327 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-k9crr" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.514128 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-k9crr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.514491 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k9crr" podUID="7909bf76-0bc7-49e8-8711-f7229c71b3eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: 
connection refused" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.529552 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sdtzn" podStartSLOduration=134.529527207 podStartE2EDuration="2m14.529527207s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:49.526599946 +0000 UTC m=+154.393080353" watchObservedRunningTime="2025-12-04 09:41:49.529527207 +0000 UTC m=+154.396007584" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.542295 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" event={"ID":"e068a6af-d8aa-4828-af79-72459f1f5525","Type":"ContainerStarted","Data":"5ebf0e3571ff3b3a4f9b937e9c9f09ca623442dc78b7c709821fc9103fa87e77"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.547347 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" event={"ID":"d11245f8-3b53-4363-babf-6d47d9628e1b","Type":"ContainerStarted","Data":"c45a0fb5b0f3e0fb6062c36ed2d682eca9697271cef1be5f6213ce49c05b3ba4"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.553335 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.554231 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:49 
crc kubenswrapper[4776]: E1204 09:41:49.556167 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:50.056154111 +0000 UTC m=+154.922634488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.557851 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" event={"ID":"10e1f0c5-eed7-40dd-871d-aa574ed684b7","Type":"ContainerStarted","Data":"d3525077c29562a3e4e473cc3cdf50e0dcb2b14f92eec35e9e1cf08ca347943d"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.559218 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.561289 4776 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-lxr6t container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:5443/healthz\": dial tcp 10.217.0.24:5443: connect: connection refused" start-of-body= Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.561414 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" podUID="10e1f0c5-eed7-40dd-871d-aa574ed684b7" containerName="packageserver" probeResult="failure" 
output="Get \"https://10.217.0.24:5443/healthz\": dial tcp 10.217.0.24:5443: connect: connection refused" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.571904 4776 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-97ppf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.574139 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" podUID="d11245f8-3b53-4363-babf-6d47d9628e1b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.585381 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zcqpr" event={"ID":"aa3d6c58-9a36-4ec7-b450-89f39ca4d772","Type":"ContainerStarted","Data":"1b490d207465ba78e8bbaddad5233326a7eeeb6130e1c9fc9616c3a4c5421760"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.594497 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" event={"ID":"cb24f3e7-571e-472f-8945-f50d74e07994","Type":"ContainerStarted","Data":"d9ebaeaac9fd4adebf6ce7756c70931d66e85f71e21310a8f7b29ff86fdd8f08"} Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.597964 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.602028 4776 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-smxws container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get 
\"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.602093 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" podUID="1e450f38-92b1-4da3-8cb6-353756403eb6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.617967 4776 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dgsdv container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.618120 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" podUID="cb24f3e7-571e-472f-8945-f50d74e07994" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.652891 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xcpjt" podStartSLOduration=134.652865993 podStartE2EDuration="2m14.652865993s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:49.6495334 +0000 UTC m=+154.516013777" watchObservedRunningTime="2025-12-04 09:41:49.652865993 +0000 UTC m=+154.519346370" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.657025 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:49 crc kubenswrapper[4776]: E1204 09:41:49.657457 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:50.157439754 +0000 UTC m=+155.023920131 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.658197 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:49 crc kubenswrapper[4776]: E1204 09:41:49.663112 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:50.163088469 +0000 UTC m=+155.029569036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.721299 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wt658" podStartSLOduration=134.72126821 podStartE2EDuration="2m14.72126821s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:49.717169343 +0000 UTC m=+154.583649740" watchObservedRunningTime="2025-12-04 09:41:49.72126821 +0000 UTC m=+154.587748587" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.760340 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:49 crc kubenswrapper[4776]: E1204 09:41:49.765168 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:50.265143527 +0000 UTC m=+155.131623904 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.774831 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4xnmk" podStartSLOduration=134.774806486 podStartE2EDuration="2m14.774806486s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:49.772552046 +0000 UTC m=+154.639032423" watchObservedRunningTime="2025-12-04 09:41:49.774806486 +0000 UTC m=+154.641286863" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.834968 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" podStartSLOduration=134.834940836 podStartE2EDuration="2m14.834940836s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:49.83247674 +0000 UTC m=+154.698957127" watchObservedRunningTime="2025-12-04 09:41:49.834940836 +0000 UTC m=+154.701421213" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.864603 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:49 crc kubenswrapper[4776]: E1204 09:41:49.865162 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:50.365145101 +0000 UTC m=+155.231625478 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.882888 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" podStartSLOduration=134.882853569 podStartE2EDuration="2m14.882853569s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:49.882124776 +0000 UTC m=+154.748605163" watchObservedRunningTime="2025-12-04 09:41:49.882853569 +0000 UTC m=+154.749333956" Dec 04 09:41:49 crc kubenswrapper[4776]: I1204 09:41:49.973433 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:49 crc 
kubenswrapper[4776]: E1204 09:41:49.974607 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:50.474560686 +0000 UTC m=+155.341041063 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.016477 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" podStartSLOduration=135.016453452 podStartE2EDuration="2m15.016453452s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:50.015134422 +0000 UTC m=+154.881614799" watchObservedRunningTime="2025-12-04 09:41:50.016453452 +0000 UTC m=+154.882933829" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.075696 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:50 crc kubenswrapper[4776]: E1204 09:41:50.076274 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:50.576255983 +0000 UTC m=+155.442736360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.096853 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kk2fd" podStartSLOduration=135.096829729 podStartE2EDuration="2m15.096829729s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:50.09556232 +0000 UTC m=+154.962042717" watchObservedRunningTime="2025-12-04 09:41:50.096829729 +0000 UTC m=+154.963310106" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.159806 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" podStartSLOduration=135.159767496 podStartE2EDuration="2m15.159767496s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:50.157672572 +0000 UTC m=+155.024152959" watchObservedRunningTime="2025-12-04 09:41:50.159767496 +0000 UTC m=+155.026247883" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.177734 4776 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:50 crc kubenswrapper[4776]: E1204 09:41:50.178128 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:50.678108253 +0000 UTC m=+155.544588630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.214631 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:50 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:50 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:50 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.214724 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.228632 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zcqpr" podStartSLOduration=10.228604726 podStartE2EDuration="10.228604726s" podCreationTimestamp="2025-12-04 09:41:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:50.219840665 +0000 UTC m=+155.086321062" watchObservedRunningTime="2025-12-04 09:41:50.228604726 +0000 UTC m=+155.095085093" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.280020 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.280059 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qtmtj" podStartSLOduration=135.280033977 podStartE2EDuration="2m15.280033977s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:50.279319925 +0000 UTC m=+155.145800302" watchObservedRunningTime="2025-12-04 09:41:50.280033977 +0000 UTC m=+155.146514364" Dec 04 09:41:50 crc kubenswrapper[4776]: E1204 09:41:50.280548 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:50.780529503 +0000 UTC m=+155.647009890 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.349782 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-75562" podStartSLOduration=135.34974527399999 podStartE2EDuration="2m15.349745274s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:50.340437036 +0000 UTC m=+155.206917413" watchObservedRunningTime="2025-12-04 09:41:50.349745274 +0000 UTC m=+155.216225651" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.381665 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:50 crc kubenswrapper[4776]: E1204 09:41:50.382133 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:50.882113256 +0000 UTC m=+155.748593633 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.484296 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:50 crc kubenswrapper[4776]: E1204 09:41:50.485297 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:50.985282938 +0000 UTC m=+155.851763315 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.585616 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:50 crc kubenswrapper[4776]: E1204 09:41:50.585866 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:51.085824179 +0000 UTC m=+155.952304576 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.586258 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:50 crc kubenswrapper[4776]: E1204 09:41:50.587094 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:51.087065466 +0000 UTC m=+155.953545843 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.612141 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcjnl" event={"ID":"5f87db82-b5ae-4241-9157-006a613f8425","Type":"ContainerStarted","Data":"afba8ca50e51859cc92b9b45e6b44066f129594a45a01363a140a79839def58b"} Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.667901 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-k9crr" podStartSLOduration=135.667875198 podStartE2EDuration="2m15.667875198s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:50.667043161 +0000 UTC m=+155.533523558" watchObservedRunningTime="2025-12-04 09:41:50.667875198 +0000 UTC m=+155.534355575" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.670357 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lv9hp" podStartSLOduration=135.670337974 podStartE2EDuration="2m15.670337974s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:50.511437037 +0000 UTC m=+155.377917414" watchObservedRunningTime="2025-12-04 09:41:50.670337974 +0000 UTC 
m=+155.536818361" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.675254 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" event={"ID":"2d3500e4-b2b0-4b3b-b8e2-c0940080d2c0","Type":"ContainerStarted","Data":"e68e8ea84d411e4aed3f44dc531ed7c1f45bbcb62a94ba3782c420dfe6a5a355"} Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.691720 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:50 crc kubenswrapper[4776]: E1204 09:41:50.692024 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:51.191979553 +0000 UTC m=+156.058459930 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.692144 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.692665 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rwkcl" event={"ID":"bf1ecc29-3516-4f77-8dc2-3e3e4e2ac4bd","Type":"ContainerStarted","Data":"cc5ac88bd399c77a608503a730adc83fe0391d7035b7224dda44bbc2424a0800"} Dec 04 09:41:50 crc kubenswrapper[4776]: E1204 09:41:50.692977 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:51.192965144 +0000 UTC m=+156.059445521 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.693668 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rwkcl" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.728120 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vcljg" event={"ID":"3b021ce5-71f1-4fac-9096-e9a6e8e820c3","Type":"ContainerStarted","Data":"2dcf757b2034171f20344bbd2a62be466d3cd3eabc937d15ba425bcc7f5cb755"} Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.739821 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kqzwm" event={"ID":"80508cf3-64e7-4f5d-848f-055be21a60d2","Type":"ContainerStarted","Data":"10b97a2770ffeb107f65ee278de2e9cb9a48502e13bfda4c171ab3eeea800157"} Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.742641 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-k9crr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.742697 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k9crr" podUID="7909bf76-0bc7-49e8-8711-f7229c71b3eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 04 
09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.761135 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgsdv" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.793624 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:50 crc kubenswrapper[4776]: E1204 09:41:50.795812 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:51.295788695 +0000 UTC m=+156.162269072 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.819146 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dcjnl" podStartSLOduration=135.819117376 podStartE2EDuration="2m15.819117376s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:50.816520116 +0000 UTC m=+155.683000513" 
watchObservedRunningTime="2025-12-04 09:41:50.819117376 +0000 UTC m=+155.685597753" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.832493 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.899352 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:50 crc kubenswrapper[4776]: E1204 09:41:50.903472 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:51.403442495 +0000 UTC m=+156.269922902 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:50 crc kubenswrapper[4776]: I1204 09:41:50.926902 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xf9p8" podStartSLOduration=135.92687632 podStartE2EDuration="2m15.92687632s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:50.926639823 +0000 UTC m=+155.793120200" watchObservedRunningTime="2025-12-04 09:41:50.92687632 +0000 UTC m=+155.793356697" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.006344 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.006577 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:51.506539826 +0000 UTC m=+156.373020203 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.007318 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.007778 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:51.507768463 +0000 UTC m=+156.374248840 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.038314 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rwkcl" podStartSLOduration=10.038282617 podStartE2EDuration="10.038282617s" podCreationTimestamp="2025-12-04 09:41:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:51.034573953 +0000 UTC m=+155.901054330" watchObservedRunningTime="2025-12-04 09:41:51.038282617 +0000 UTC m=+155.904762994" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.073010 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-kqzwm" podStartSLOduration=136.072982331 podStartE2EDuration="2m16.072982331s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:51.071020631 +0000 UTC m=+155.937501018" watchObservedRunningTime="2025-12-04 09:41:51.072982331 +0000 UTC m=+155.939462728" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.108629 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.108857 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:51.60882222 +0000 UTC m=+156.475302617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.108942 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.109431 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:51.609412538 +0000 UTC m=+156.475892915 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.210138 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.210892 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:51.710872878 +0000 UTC m=+156.577353255 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.217776 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:51 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:51 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:51 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.218239 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.311955 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.312481 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 09:41:51.81245546 +0000 UTC m=+156.678935837 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.368878 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bhnrh"] Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.370800 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.376858 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.396895 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bhnrh"] Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.428308 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.428536 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 09:41:51.92849735 +0000 UTC m=+156.794977737 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.428810 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.429350 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:51.929333717 +0000 UTC m=+156.795814094 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.529993 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.530274 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:52.030230947 +0000 UTC m=+156.896711344 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.530450 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cjh2\" (UniqueName: \"kubernetes.io/projected/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-kube-api-access-4cjh2\") pod \"certified-operators-bhnrh\" (UID: \"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\") " pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.530478 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-utilities\") pod \"certified-operators-bhnrh\" (UID: \"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\") " pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.530520 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-catalog-content\") pod \"certified-operators-bhnrh\" (UID: \"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\") " pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.530562 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.530966 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:52.03094956 +0000 UTC m=+156.897429927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.632371 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.632597 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:52.132562534 +0000 UTC m=+156.999042911 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.632711 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-catalog-content\") pod \"certified-operators-bhnrh\" (UID: \"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\") " pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.632819 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.633025 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cjh2\" (UniqueName: \"kubernetes.io/projected/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-kube-api-access-4cjh2\") pod \"certified-operators-bhnrh\" (UID: \"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\") " pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.633055 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-utilities\") pod \"certified-operators-bhnrh\" (UID: 
\"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\") " pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.633294 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-catalog-content\") pod \"certified-operators-bhnrh\" (UID: \"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\") " pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.633346 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:52.133324748 +0000 UTC m=+156.999805125 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.633540 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-utilities\") pod \"certified-operators-bhnrh\" (UID: \"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\") " pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.661527 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cjh2\" (UniqueName: \"kubernetes.io/projected/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-kube-api-access-4cjh2\") pod \"certified-operators-bhnrh\" (UID: 
\"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\") " pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.700644 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.733500 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.733892 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:52.233850908 +0000 UTC m=+157.100331275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.743230 4776 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-lxr6t container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.743324 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" podUID="10e1f0c5-eed7-40dd-871d-aa574ed684b7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.24:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.749932 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kdjvq"] Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.751262 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.775361 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pb54k" event={"ID":"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1","Type":"ContainerStarted","Data":"bfad4bc8a5ea95a0bc89342518fcfc06f93cdf8ea695a97cdcbd416d3507e1d2"} Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.797741 4776 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.819472 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vcljg" event={"ID":"3b021ce5-71f1-4fac-9096-e9a6e8e820c3","Type":"ContainerStarted","Data":"c6635dec4c1af09658e95c84f8bbf4b61206f897b153be80fe7f01f25aad28bc"} Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.820302 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-k9crr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.820365 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k9crr" podUID="7909bf76-0bc7-49e8-8711-f7229c71b3eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.839452 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x48tq\" (UniqueName: \"kubernetes.io/projected/0f7b6e89-f248-4b5e-82ed-809ef10f018e-kube-api-access-x48tq\") pod \"certified-operators-kdjvq\" (UID: 
\"0f7b6e89-f248-4b5e-82ed-809ef10f018e\") " pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.839627 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f7b6e89-f248-4b5e-82ed-809ef10f018e-catalog-content\") pod \"certified-operators-kdjvq\" (UID: \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\") " pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.839662 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.839748 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f7b6e89-f248-4b5e-82ed-809ef10f018e-utilities\") pod \"certified-operators-kdjvq\" (UID: \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\") " pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.845996 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:52.345972617 +0000 UTC m=+157.212453194 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.882510 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdjvq"] Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.897462 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-vcljg" podStartSLOduration=136.897427908 podStartE2EDuration="2m16.897427908s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:51.879562116 +0000 UTC m=+156.746042513" watchObservedRunningTime="2025-12-04 09:41:51.897427908 +0000 UTC m=+156.763908285" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.943195 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.943941 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 09:41:52.443889767 +0000 UTC m=+157.310370144 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.944445 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x48tq\" (UniqueName: \"kubernetes.io/projected/0f7b6e89-f248-4b5e-82ed-809ef10f018e-kube-api-access-x48tq\") pod \"certified-operators-kdjvq\" (UID: \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\") " pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.944961 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f7b6e89-f248-4b5e-82ed-809ef10f018e-catalog-content\") pod \"certified-operators-kdjvq\" (UID: \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\") " pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.945056 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.945141 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0f7b6e89-f248-4b5e-82ed-809ef10f018e-utilities\") pod \"certified-operators-kdjvq\" (UID: \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\") " pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.945802 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f7b6e89-f248-4b5e-82ed-809ef10f018e-utilities\") pod \"certified-operators-kdjvq\" (UID: \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\") " pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.946126 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f7b6e89-f248-4b5e-82ed-809ef10f018e-catalog-content\") pod \"certified-operators-kdjvq\" (UID: \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\") " pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:41:51 crc kubenswrapper[4776]: E1204 09:41:51.946522 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:52.446507838 +0000 UTC m=+157.312988215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.980346 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x48tq\" (UniqueName: \"kubernetes.io/projected/0f7b6e89-f248-4b5e-82ed-809ef10f018e-kube-api-access-x48tq\") pod \"certified-operators-kdjvq\" (UID: \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\") " pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.986538 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lxr6t" Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.996877 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j2289"] Dec 04 09:41:51 crc kubenswrapper[4776]: I1204 09:41:51.999549 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j2289" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.012703 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.073493 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.074312 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/719304d2-2416-40be-b76a-ca884c683161-utilities\") pod \"community-operators-j2289\" (UID: \"719304d2-2416-40be-b76a-ca884c683161\") " pod="openshift-marketplace/community-operators-j2289" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.074408 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srv4b\" (UniqueName: \"kubernetes.io/projected/719304d2-2416-40be-b76a-ca884c683161-kube-api-access-srv4b\") pod \"community-operators-j2289\" (UID: \"719304d2-2416-40be-b76a-ca884c683161\") " pod="openshift-marketplace/community-operators-j2289" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.074490 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/719304d2-2416-40be-b76a-ca884c683161-catalog-content\") pod \"community-operators-j2289\" (UID: \"719304d2-2416-40be-b76a-ca884c683161\") " pod="openshift-marketplace/community-operators-j2289" Dec 04 09:41:52 crc kubenswrapper[4776]: E1204 09:41:52.076074 4776 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:52.576050386 +0000 UTC m=+157.442530763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.085073 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.097459 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j2289"] Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.183936 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.184029 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/719304d2-2416-40be-b76a-ca884c683161-utilities\") pod \"community-operators-j2289\" (UID: \"719304d2-2416-40be-b76a-ca884c683161\") " pod="openshift-marketplace/community-operators-j2289" Dec 04 09:41:52 
crc kubenswrapper[4776]: I1204 09:41:52.184063 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srv4b\" (UniqueName: \"kubernetes.io/projected/719304d2-2416-40be-b76a-ca884c683161-kube-api-access-srv4b\") pod \"community-operators-j2289\" (UID: \"719304d2-2416-40be-b76a-ca884c683161\") " pod="openshift-marketplace/community-operators-j2289" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.184095 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/719304d2-2416-40be-b76a-ca884c683161-catalog-content\") pod \"community-operators-j2289\" (UID: \"719304d2-2416-40be-b76a-ca884c683161\") " pod="openshift-marketplace/community-operators-j2289" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.184732 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/719304d2-2416-40be-b76a-ca884c683161-catalog-content\") pod \"community-operators-j2289\" (UID: \"719304d2-2416-40be-b76a-ca884c683161\") " pod="openshift-marketplace/community-operators-j2289" Dec 04 09:41:52 crc kubenswrapper[4776]: E1204 09:41:52.185163 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:52.685144381 +0000 UTC m=+157.551624758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.185687 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/719304d2-2416-40be-b76a-ca884c683161-utilities\") pod \"community-operators-j2289\" (UID: \"719304d2-2416-40be-b76a-ca884c683161\") " pod="openshift-marketplace/community-operators-j2289" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.215218 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:52 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:52 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:52 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.215296 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.260021 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srv4b\" (UniqueName: \"kubernetes.io/projected/719304d2-2416-40be-b76a-ca884c683161-kube-api-access-srv4b\") pod \"community-operators-j2289\" (UID: 
\"719304d2-2416-40be-b76a-ca884c683161\") " pod="openshift-marketplace/community-operators-j2289" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.279949 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4sbfh"] Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.281246 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.288760 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:52 crc kubenswrapper[4776]: E1204 09:41:52.289270 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:52.789243101 +0000 UTC m=+157.655723478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.296794 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:52 crc kubenswrapper[4776]: E1204 09:41:52.297345 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:52.797330231 +0000 UTC m=+157.663810598 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.332659 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4sbfh"] Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.339300 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j2289" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.400662 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:52 crc kubenswrapper[4776]: E1204 09:41:52.400832 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:52.900799663 +0000 UTC m=+157.767280030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.400957 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.401024 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xs2d\" (UniqueName: \"kubernetes.io/projected/78f24e33-4605-4ded-98fc-83e96aa46b09-kube-api-access-4xs2d\") pod \"community-operators-4sbfh\" (UID: \"78f24e33-4605-4ded-98fc-83e96aa46b09\") " pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.401048 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f24e33-4605-4ded-98fc-83e96aa46b09-utilities\") pod \"community-operators-4sbfh\" (UID: \"78f24e33-4605-4ded-98fc-83e96aa46b09\") " pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.401065 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/78f24e33-4605-4ded-98fc-83e96aa46b09-catalog-content\") pod \"community-operators-4sbfh\" (UID: \"78f24e33-4605-4ded-98fc-83e96aa46b09\") " pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:41:52 crc kubenswrapper[4776]: E1204 09:41:52.401472 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:52.901462923 +0000 UTC m=+157.767943520 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.510786 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:52 crc kubenswrapper[4776]: E1204 09:41:52.511194 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:53.011142087 +0000 UTC m=+157.877622464 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.511430 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xs2d\" (UniqueName: \"kubernetes.io/projected/78f24e33-4605-4ded-98fc-83e96aa46b09-kube-api-access-4xs2d\") pod \"community-operators-4sbfh\" (UID: \"78f24e33-4605-4ded-98fc-83e96aa46b09\") " pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.511465 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f24e33-4605-4ded-98fc-83e96aa46b09-utilities\") pod \"community-operators-4sbfh\" (UID: \"78f24e33-4605-4ded-98fc-83e96aa46b09\") " pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.511483 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f24e33-4605-4ded-98fc-83e96aa46b09-catalog-content\") pod \"community-operators-4sbfh\" (UID: \"78f24e33-4605-4ded-98fc-83e96aa46b09\") " pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.511579 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: 
\"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.512390 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f24e33-4605-4ded-98fc-83e96aa46b09-utilities\") pod \"community-operators-4sbfh\" (UID: \"78f24e33-4605-4ded-98fc-83e96aa46b09\") " pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.512961 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f24e33-4605-4ded-98fc-83e96aa46b09-catalog-content\") pod \"community-operators-4sbfh\" (UID: \"78f24e33-4605-4ded-98fc-83e96aa46b09\") " pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:41:52 crc kubenswrapper[4776]: E1204 09:41:52.513045 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:53.013030855 +0000 UTC m=+157.879511232 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.602882 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xs2d\" (UniqueName: \"kubernetes.io/projected/78f24e33-4605-4ded-98fc-83e96aa46b09-kube-api-access-4xs2d\") pod \"community-operators-4sbfh\" (UID: \"78f24e33-4605-4ded-98fc-83e96aa46b09\") " pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.611392 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.612678 4776 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-04T09:41:51.797787596Z","Handler":null,"Name":""} Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.614126 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:52 crc kubenswrapper[4776]: E1204 09:41:52.614480 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 09:41:53.114438792 +0000 UTC m=+157.980919179 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.614612 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:52 crc kubenswrapper[4776]: E1204 09:41:52.615231 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 09:41:53.115223067 +0000 UTC m=+157.981703444 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkbvc" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.625217 4776 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.625279 4776 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.726195 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.736136 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.829206 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.866816 4776 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.866902 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.916047 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.917670 4776 generic.go:334] "Generic (PLEG): container finished" podID="a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17" containerID="47660be036b97a021b847481665ad6a5fc760f50a090b30a8cb4fd1caa779898" exitCode=0 Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.918016 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" 
event={"ID":"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17","Type":"ContainerDied","Data":"47660be036b97a021b847481665ad6a5fc760f50a090b30a8cb4fd1caa779898"} Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.918705 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.941436 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:52 crc kubenswrapper[4776]: I1204 09:41:52.962635 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pb54k" event={"ID":"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1","Type":"ContainerStarted","Data":"2c2eaddc4294257136f2c1804724625f852449859d0821b59f38da335d9aa73c"} Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.008905 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdjvq"] Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.104166 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.104225 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.138163 4776 patch_prober.go:28] interesting pod/console-f9d7485db-vm645 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.138330 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vm645" podUID="52d9a038-9fbd-4306-9e4a-00901ca865dc" containerName="console" probeResult="failure" output="Get 
\"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.211181 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.221352 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:53 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:53 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:53 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.221528 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.238138 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bhnrh"] Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.248937 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkbvc\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:53 crc kubenswrapper[4776]: W1204 09:41:53.264070 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c47789_6fcb_4f3d_9b38_99643a8fe1a2.slice/crio-711fda8b721e0fa0a03189582178e53a5a8b28c791963a704633ba689b7a431e WatchSource:0}: Error finding container 711fda8b721e0fa0a03189582178e53a5a8b28c791963a704633ba689b7a431e: Status 404 returned error can't find the container with id 711fda8b721e0fa0a03189582178e53a5a8b28c791963a704633ba689b7a431e Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.368068 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j2289"] Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.461544 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.477186 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.483799 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4sbfh"] Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.538303 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.615027 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vrzln"] Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.617761 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.662904 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrzln"] Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.664419 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-k9crr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.664483 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k9crr" podUID="7909bf76-0bc7-49e8-8711-f7229c71b3eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.665213 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.678738 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.701119 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.685213 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-k9crr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.701273 4776 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-k9crr" podUID="7909bf76-0bc7-49e8-8711-f7229c71b3eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.783813 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx4rb\" (UniqueName: \"kubernetes.io/projected/b1d11071-0dee-4b5f-989e-36b89f0eb26f-kube-api-access-mx4rb\") pod \"redhat-marketplace-vrzln\" (UID: \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\") " pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.783881 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d11071-0dee-4b5f-989e-36b89f0eb26f-catalog-content\") pod \"redhat-marketplace-vrzln\" (UID: \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\") " pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.784220 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d11071-0dee-4b5f-989e-36b89f0eb26f-utilities\") pod \"redhat-marketplace-vrzln\" (UID: \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\") " pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.894838 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx4rb\" (UniqueName: \"kubernetes.io/projected/b1d11071-0dee-4b5f-989e-36b89f0eb26f-kube-api-access-mx4rb\") pod \"redhat-marketplace-vrzln\" (UID: \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\") " pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.895352 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d11071-0dee-4b5f-989e-36b89f0eb26f-catalog-content\") pod \"redhat-marketplace-vrzln\" (UID: \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\") " pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.895457 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d11071-0dee-4b5f-989e-36b89f0eb26f-utilities\") pod \"redhat-marketplace-vrzln\" (UID: \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\") " pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.896133 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d11071-0dee-4b5f-989e-36b89f0eb26f-utilities\") pod \"redhat-marketplace-vrzln\" (UID: \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\") " pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.896670 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d11071-0dee-4b5f-989e-36b89f0eb26f-catalog-content\") pod \"redhat-marketplace-vrzln\" (UID: \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\") " pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.938428 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx4rb\" (UniqueName: \"kubernetes.io/projected/b1d11071-0dee-4b5f-989e-36b89f0eb26f-kube-api-access-mx4rb\") pod \"redhat-marketplace-vrzln\" (UID: \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\") " pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.978734 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-dcbwh"] Dec 04 09:41:53 crc kubenswrapper[4776]: I1204 09:41:53.980216 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.017749 4776 generic.go:334] "Generic (PLEG): container finished" podID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" containerID="f69194bdf91c186b7c2756398944222d3f4830e70889fd5fce967487fe8c661b" exitCode=0 Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.017814 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdjvq" event={"ID":"0f7b6e89-f248-4b5e-82ed-809ef10f018e","Type":"ContainerDied","Data":"f69194bdf91c186b7c2756398944222d3f4830e70889fd5fce967487fe8c661b"} Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.017845 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdjvq" event={"ID":"0f7b6e89-f248-4b5e-82ed-809ef10f018e","Type":"ContainerStarted","Data":"510996b1f23a75c8021a1c23e95116231af855c550fd1a0f202ce7487db5107d"} Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.024954 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dcbwh"] Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.025436 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.026483 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhnrh" event={"ID":"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2","Type":"ContainerStarted","Data":"711fda8b721e0fa0a03189582178e53a5a8b28c791963a704633ba689b7a431e"} Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.034270 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sbfh" 
event={"ID":"78f24e33-4605-4ded-98fc-83e96aa46b09","Type":"ContainerStarted","Data":"d3e09d6473adad5d26a6fd3f5d38b66e62bdacd9fcc49e369afb96ce6d5eb244"} Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.066606 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pb54k" event={"ID":"6e6e7228-bf39-47ac-ab7a-cf61cb5112c1","Type":"ContainerStarted","Data":"a13baf0d30acce5163485724bd1f5ca11044d01caf5b77039695e32aae655f6c"} Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.085611 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2289" event={"ID":"719304d2-2416-40be-b76a-ca884c683161","Type":"ContainerStarted","Data":"f25f50779d373cdc6969f51388b7533403b4cacb0a6350dea29ce72c61173f01"} Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.107951 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfmgd\" (UniqueName: \"kubernetes.io/projected/740917a7-9acc-4dbd-8d31-329cfd0538b3-kube-api-access-vfmgd\") pod \"redhat-marketplace-dcbwh\" (UID: \"740917a7-9acc-4dbd-8d31-329cfd0538b3\") " pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.108051 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740917a7-9acc-4dbd-8d31-329cfd0538b3-utilities\") pod \"redhat-marketplace-dcbwh\" (UID: \"740917a7-9acc-4dbd-8d31-329cfd0538b3\") " pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.108116 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740917a7-9acc-4dbd-8d31-329cfd0538b3-catalog-content\") pod \"redhat-marketplace-dcbwh\" (UID: \"740917a7-9acc-4dbd-8d31-329cfd0538b3\") " 
pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.122166 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjmzc" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.146428 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.169879 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-pb54k" podStartSLOduration=14.169856497 podStartE2EDuration="14.169856497s" podCreationTimestamp="2025-12-04 09:41:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:54.137055521 +0000 UTC m=+159.003535908" watchObservedRunningTime="2025-12-04 09:41:54.169856497 +0000 UTC m=+159.036336874" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.213621 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfmgd\" (UniqueName: \"kubernetes.io/projected/740917a7-9acc-4dbd-8d31-329cfd0538b3-kube-api-access-vfmgd\") pod \"redhat-marketplace-dcbwh\" (UID: \"740917a7-9acc-4dbd-8d31-329cfd0538b3\") " pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.213749 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740917a7-9acc-4dbd-8d31-329cfd0538b3-utilities\") pod \"redhat-marketplace-dcbwh\" (UID: \"740917a7-9acc-4dbd-8d31-329cfd0538b3\") " pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.213851 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/740917a7-9acc-4dbd-8d31-329cfd0538b3-catalog-content\") pod \"redhat-marketplace-dcbwh\" (UID: \"740917a7-9acc-4dbd-8d31-329cfd0538b3\") " pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.216688 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740917a7-9acc-4dbd-8d31-329cfd0538b3-utilities\") pod \"redhat-marketplace-dcbwh\" (UID: \"740917a7-9acc-4dbd-8d31-329cfd0538b3\") " pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.218646 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740917a7-9acc-4dbd-8d31-329cfd0538b3-catalog-content\") pod \"redhat-marketplace-dcbwh\" (UID: \"740917a7-9acc-4dbd-8d31-329cfd0538b3\") " pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.222746 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:54 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:54 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:54 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.222827 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.255149 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfmgd\" (UniqueName: 
\"kubernetes.io/projected/740917a7-9acc-4dbd-8d31-329cfd0538b3-kube-api-access-vfmgd\") pod \"redhat-marketplace-dcbwh\" (UID: \"740917a7-9acc-4dbd-8d31-329cfd0538b3\") " pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.258871 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hkbvc"] Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.355353 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.788381 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrzln"] Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.822195 4776 patch_prober.go:28] interesting pod/apiserver-76f77b778f-vcljg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 04 09:41:54 crc kubenswrapper[4776]: [+]log ok Dec 04 09:41:54 crc kubenswrapper[4776]: [+]etcd ok Dec 04 09:41:54 crc kubenswrapper[4776]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 04 09:41:54 crc kubenswrapper[4776]: [+]poststarthook/generic-apiserver-start-informers ok Dec 04 09:41:54 crc kubenswrapper[4776]: [+]poststarthook/max-in-flight-filter ok Dec 04 09:41:54 crc kubenswrapper[4776]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 04 09:41:54 crc kubenswrapper[4776]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 04 09:41:54 crc kubenswrapper[4776]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 04 09:41:54 crc kubenswrapper[4776]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 04 09:41:54 crc kubenswrapper[4776]: [+]poststarthook/project.openshift.io-projectcache ok Dec 04 
09:41:54 crc kubenswrapper[4776]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 04 09:41:54 crc kubenswrapper[4776]: [+]poststarthook/openshift.io-startinformers ok Dec 04 09:41:54 crc kubenswrapper[4776]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 04 09:41:54 crc kubenswrapper[4776]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 04 09:41:54 crc kubenswrapper[4776]: livez check failed Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.822331 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-vcljg" podUID="3b021ce5-71f1-4fac-9096-e9a6e8e820c3" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.930821 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.954441 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m27lb"] Dec 04 09:41:54 crc kubenswrapper[4776]: E1204 09:41:54.956650 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17" containerName="collect-profiles" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.956678 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17" containerName="collect-profiles" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.956858 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17" containerName="collect-profiles" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.958546 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.966961 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 09:41:54 crc kubenswrapper[4776]: I1204 09:41:54.970900 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m27lb"] Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.010003 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dcbwh"] Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.026515 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-secret-volume\") pod \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\" (UID: \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\") " Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.026615 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qkxf\" (UniqueName: \"kubernetes.io/projected/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-kube-api-access-8qkxf\") pod \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\" (UID: \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\") " Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.026731 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-config-volume\") pod \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\" (UID: \"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17\") " Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.039052 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-config-volume" (OuterVolumeSpecName: "config-volume") pod "a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17" (UID: 
"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.054200 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-kube-api-access-8qkxf" (OuterVolumeSpecName: "kube-api-access-8qkxf") pod "a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17" (UID: "a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17"). InnerVolumeSpecName "kube-api-access-8qkxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.059348 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17" (UID: "a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.097342 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dcbwh" event={"ID":"740917a7-9acc-4dbd-8d31-329cfd0538b3","Type":"ContainerStarted","Data":"ac5fc97e972fb6d8d40cd7f3e9e89de0a057d96621b56ab86aed230a2b6dddc4"} Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.099431 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" event={"ID":"e358131f-46f1-40bc-9a4a-93798e8a303d","Type":"ContainerStarted","Data":"906749eca81dee8555bfc8c9ad048918e93268acc4b0da8f2b10adad6c4529e1"} Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.099457 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" 
event={"ID":"e358131f-46f1-40bc-9a4a-93798e8a303d","Type":"ContainerStarted","Data":"c3cce169b3de727cd5feea63d91562bac72096cd5955e27a0a72e8811de7013a"} Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.100103 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.109716 4776 generic.go:334] "Generic (PLEG): container finished" podID="719304d2-2416-40be-b76a-ca884c683161" containerID="17fb3685f4c4f718a37524327f1874fe66ffdd545c69b1eceecff349f9b9a12d" exitCode=0 Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.109812 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2289" event={"ID":"719304d2-2416-40be-b76a-ca884c683161","Type":"ContainerDied","Data":"17fb3685f4c4f718a37524327f1874fe66ffdd545c69b1eceecff349f9b9a12d"} Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.112031 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.112102 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f" event={"ID":"a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17","Type":"ContainerDied","Data":"64d72fa499fcf831c8986a3963962fc4e7a4bc3e881ad4f96995a7e5b4069385"} Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.112148 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64d72fa499fcf831c8986a3963962fc4e7a4bc3e881ad4f96995a7e5b4069385" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.116650 4776 generic.go:334] "Generic (PLEG): container finished" podID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" containerID="b5ae8669b0ba21fd374df377207b8c59963d908e14ec6ccdc876f3ec1c5e9d3d" exitCode=0 Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.116766 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhnrh" event={"ID":"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2","Type":"ContainerDied","Data":"b5ae8669b0ba21fd374df377207b8c59963d908e14ec6ccdc876f3ec1c5e9d3d"} Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.129484 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85g6g\" (UniqueName: \"kubernetes.io/projected/0b861681-465a-4c51-8663-ecd652c7c7b0-kube-api-access-85g6g\") pod \"redhat-operators-m27lb\" (UID: \"0b861681-465a-4c51-8663-ecd652c7c7b0\") " pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.129627 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b861681-465a-4c51-8663-ecd652c7c7b0-utilities\") pod \"redhat-operators-m27lb\" (UID: 
\"0b861681-465a-4c51-8663-ecd652c7c7b0\") " pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.129680 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b861681-465a-4c51-8663-ecd652c7c7b0-catalog-content\") pod \"redhat-operators-m27lb\" (UID: \"0b861681-465a-4c51-8663-ecd652c7c7b0\") " pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.129717 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.129750 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qkxf\" (UniqueName: \"kubernetes.io/projected/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-kube-api-access-8qkxf\") on node \"crc\" DevicePath \"\"" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.129761 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.130485 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" podStartSLOduration=140.130457407 podStartE2EDuration="2m20.130457407s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:55.120433837 +0000 UTC m=+159.986914214" watchObservedRunningTime="2025-12-04 09:41:55.130457407 +0000 UTC m=+159.996937784" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.157858 4776 generic.go:334] "Generic 
(PLEG): container finished" podID="78f24e33-4605-4ded-98fc-83e96aa46b09" containerID="2a77b1ce84dc8fa8d318149cd943b11aa9b9f37558c0ec05a48bbd3ccdfc6cab" exitCode=0 Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.158023 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sbfh" event={"ID":"78f24e33-4605-4ded-98fc-83e96aa46b09","Type":"ContainerDied","Data":"2a77b1ce84dc8fa8d318149cd943b11aa9b9f37558c0ec05a48bbd3ccdfc6cab"} Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.191378 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrzln" event={"ID":"b1d11071-0dee-4b5f-989e-36b89f0eb26f","Type":"ContainerStarted","Data":"76ad2f758eaeab4ad1ece32bdbb03467c7400c6a6bad8d78a1334001789a9ebe"} Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.214782 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:55 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:55 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:55 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.214887 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.231800 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b861681-465a-4c51-8663-ecd652c7c7b0-utilities\") pod \"redhat-operators-m27lb\" (UID: \"0b861681-465a-4c51-8663-ecd652c7c7b0\") " 
pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.231864 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b861681-465a-4c51-8663-ecd652c7c7b0-catalog-content\") pod \"redhat-operators-m27lb\" (UID: \"0b861681-465a-4c51-8663-ecd652c7c7b0\") " pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.231942 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85g6g\" (UniqueName: \"kubernetes.io/projected/0b861681-465a-4c51-8663-ecd652c7c7b0-kube-api-access-85g6g\") pod \"redhat-operators-m27lb\" (UID: \"0b861681-465a-4c51-8663-ecd652c7c7b0\") " pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.233422 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b861681-465a-4c51-8663-ecd652c7c7b0-utilities\") pod \"redhat-operators-m27lb\" (UID: \"0b861681-465a-4c51-8663-ecd652c7c7b0\") " pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.233743 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b861681-465a-4c51-8663-ecd652c7c7b0-catalog-content\") pod \"redhat-operators-m27lb\" (UID: \"0b861681-465a-4c51-8663-ecd652c7c7b0\") " pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.252441 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85g6g\" (UniqueName: \"kubernetes.io/projected/0b861681-465a-4c51-8663-ecd652c7c7b0-kube-api-access-85g6g\") pod \"redhat-operators-m27lb\" (UID: \"0b861681-465a-4c51-8663-ecd652c7c7b0\") " pod="openshift-marketplace/redhat-operators-m27lb" Dec 
04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.279512 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.357293 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jgwk9"] Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.360239 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.365749 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jgwk9"] Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.436632 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-catalog-content\") pod \"redhat-operators-jgwk9\" (UID: \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\") " pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.437206 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-utilities\") pod \"redhat-operators-jgwk9\" (UID: \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\") " pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.437737 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngcxf\" (UniqueName: \"kubernetes.io/projected/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-kube-api-access-ngcxf\") pod \"redhat-operators-jgwk9\" (UID: \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\") " pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 
09:41:55.543138 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-catalog-content\") pod \"redhat-operators-jgwk9\" (UID: \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\") " pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.543196 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-utilities\") pod \"redhat-operators-jgwk9\" (UID: \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\") " pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.543292 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngcxf\" (UniqueName: \"kubernetes.io/projected/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-kube-api-access-ngcxf\") pod \"redhat-operators-jgwk9\" (UID: \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\") " pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.544709 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-catalog-content\") pod \"redhat-operators-jgwk9\" (UID: \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\") " pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.544910 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-utilities\") pod \"redhat-operators-jgwk9\" (UID: \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\") " pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.566945 4776 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ngcxf\" (UniqueName: \"kubernetes.io/projected/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-kube-api-access-ngcxf\") pod \"redhat-operators-jgwk9\" (UID: \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\") " pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.596770 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m27lb"] Dec 04 09:41:55 crc kubenswrapper[4776]: W1204 09:41:55.614155 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b861681_465a_4c51_8663_ecd652c7c7b0.slice/crio-829dad834862ac28a0a471ed252b6237fa4fe9a8200ade4479b3fb8c7f2a56fe WatchSource:0}: Error finding container 829dad834862ac28a0a471ed252b6237fa4fe9a8200ade4479b3fb8c7f2a56fe: Status 404 returned error can't find the container with id 829dad834862ac28a0a471ed252b6237fa4fe9a8200ade4479b3fb8c7f2a56fe Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.624554 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.625486 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.629765 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.630096 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.639740 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.738298 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.748701 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99ce50b0-2d5c-48d8-8348-f2ff58ea9906-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"99ce50b0-2d5c-48d8-8348-f2ff58ea9906\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.749112 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99ce50b0-2d5c-48d8-8348-f2ff58ea9906-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"99ce50b0-2d5c-48d8-8348-f2ff58ea9906\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.850988 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99ce50b0-2d5c-48d8-8348-f2ff58ea9906-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"99ce50b0-2d5c-48d8-8348-f2ff58ea9906\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.851143 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99ce50b0-2d5c-48d8-8348-f2ff58ea9906-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"99ce50b0-2d5c-48d8-8348-f2ff58ea9906\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.851633 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99ce50b0-2d5c-48d8-8348-f2ff58ea9906-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"99ce50b0-2d5c-48d8-8348-f2ff58ea9906\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.871211 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99ce50b0-2d5c-48d8-8348-f2ff58ea9906-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"99ce50b0-2d5c-48d8-8348-f2ff58ea9906\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.945392 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:41:55 crc kubenswrapper[4776]: I1204 09:41:55.965520 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.080186 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jgwk9"] Dec 04 09:41:56 crc kubenswrapper[4776]: W1204 09:41:56.090187 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e9d9dc3_bdee_489a_9e89_3fc75bd3025e.slice/crio-54e25cc9ee53caaf678ff93c5de0de4be642540b8579d4f42fc6a8801c31fb30 WatchSource:0}: Error finding container 54e25cc9ee53caaf678ff93c5de0de4be642540b8579d4f42fc6a8801c31fb30: Status 404 returned error can't find the container with id 54e25cc9ee53caaf678ff93c5de0de4be642540b8579d4f42fc6a8801c31fb30 Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.206898 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jgwk9" event={"ID":"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e","Type":"ContainerStarted","Data":"54e25cc9ee53caaf678ff93c5de0de4be642540b8579d4f42fc6a8801c31fb30"} Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.216832 4776 generic.go:334] "Generic (PLEG): container finished" podID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" containerID="c61e64ef4fba8d4dc016ca435eaf294bbe559e959d6d04b739145aa8223d76ba" exitCode=0 Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.216905 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrzln" event={"ID":"b1d11071-0dee-4b5f-989e-36b89f0eb26f","Type":"ContainerDied","Data":"c61e64ef4fba8d4dc016ca435eaf294bbe559e959d6d04b739145aa8223d76ba"} Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.228989 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Dec 04 09:41:56 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:56 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:56 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.229040 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.231720 4776 generic.go:334] "Generic (PLEG): container finished" podID="740917a7-9acc-4dbd-8d31-329cfd0538b3" containerID="ad6c47871cab369b36405a67afaf242fb9ada6ffc16e8102bd3863708dedda96" exitCode=0 Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.231796 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dcbwh" event={"ID":"740917a7-9acc-4dbd-8d31-329cfd0538b3","Type":"ContainerDied","Data":"ad6c47871cab369b36405a67afaf242fb9ada6ffc16e8102bd3863708dedda96"} Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.236509 4776 generic.go:334] "Generic (PLEG): container finished" podID="0b861681-465a-4c51-8663-ecd652c7c7b0" containerID="30a7f4fdc2694e6b429705e2425419df0c978072508fe674cdf2f2c37bc6e5f5" exitCode=0 Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.236582 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m27lb" event={"ID":"0b861681-465a-4c51-8663-ecd652c7c7b0","Type":"ContainerDied","Data":"30a7f4fdc2694e6b429705e2425419df0c978072508fe674cdf2f2c37bc6e5f5"} Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.236698 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m27lb" event={"ID":"0b861681-465a-4c51-8663-ecd652c7c7b0","Type":"ContainerStarted","Data":"829dad834862ac28a0a471ed252b6237fa4fe9a8200ade4479b3fb8c7f2a56fe"} 
Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.357959 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 09:41:56 crc kubenswrapper[4776]: W1204 09:41:56.368318 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod99ce50b0_2d5c_48d8_8348_f2ff58ea9906.slice/crio-580aefab591a5a71b3ba1d2e0a45aeb9d00f65ce40fde5eebaf0876a4ff5f8c0 WatchSource:0}: Error finding container 580aefab591a5a71b3ba1d2e0a45aeb9d00f65ce40fde5eebaf0876a4ff5f8c0: Status 404 returned error can't find the container with id 580aefab591a5a71b3ba1d2e0a45aeb9d00f65ce40fde5eebaf0876a4ff5f8c0 Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.440188 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.442522 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.447002 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.448108 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.448750 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.580420 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edc2672d-2dec-4495-b2cb-cb2f5412d36a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"edc2672d-2dec-4495-b2cb-cb2f5412d36a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 
04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.580462 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/edc2672d-2dec-4495-b2cb-cb2f5412d36a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"edc2672d-2dec-4495-b2cb-cb2f5412d36a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.682103 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edc2672d-2dec-4495-b2cb-cb2f5412d36a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"edc2672d-2dec-4495-b2cb-cb2f5412d36a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.682160 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/edc2672d-2dec-4495-b2cb-cb2f5412d36a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"edc2672d-2dec-4495-b2cb-cb2f5412d36a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.682255 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/edc2672d-2dec-4495-b2cb-cb2f5412d36a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"edc2672d-2dec-4495-b2cb-cb2f5412d36a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.705212 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edc2672d-2dec-4495-b2cb-cb2f5412d36a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"edc2672d-2dec-4495-b2cb-cb2f5412d36a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:41:56 crc kubenswrapper[4776]: I1204 09:41:56.765894 
4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:41:57 crc kubenswrapper[4776]: I1204 09:41:57.170717 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 09:41:57 crc kubenswrapper[4776]: I1204 09:41:57.212668 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:57 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:57 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:57 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:57 crc kubenswrapper[4776]: I1204 09:41:57.212751 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:57 crc kubenswrapper[4776]: I1204 09:41:57.252563 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"99ce50b0-2d5c-48d8-8348-f2ff58ea9906","Type":"ContainerStarted","Data":"15f0b0ff97d55356b7c4040689045c58bd2a9e25cfae1b617af1ecb22d2a7c09"} Dec 04 09:41:57 crc kubenswrapper[4776]: I1204 09:41:57.253178 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"99ce50b0-2d5c-48d8-8348-f2ff58ea9906","Type":"ContainerStarted","Data":"580aefab591a5a71b3ba1d2e0a45aeb9d00f65ce40fde5eebaf0876a4ff5f8c0"} Dec 04 09:41:57 crc kubenswrapper[4776]: I1204 09:41:57.258124 4776 generic.go:334] "Generic (PLEG): container finished" podID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" 
containerID="90a81eaa2c60f3b5046315e542aa6df70a8e12ee8e525bf01e7cc7b4a39cb220" exitCode=0 Dec 04 09:41:57 crc kubenswrapper[4776]: I1204 09:41:57.258180 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jgwk9" event={"ID":"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e","Type":"ContainerDied","Data":"90a81eaa2c60f3b5046315e542aa6df70a8e12ee8e525bf01e7cc7b4a39cb220"} Dec 04 09:41:57 crc kubenswrapper[4776]: I1204 09:41:57.263611 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"edc2672d-2dec-4495-b2cb-cb2f5412d36a","Type":"ContainerStarted","Data":"b4811b493ae8809b8238e44fa3168dd05d30a47e365bbf02c32dac430558770d"} Dec 04 09:41:57 crc kubenswrapper[4776]: I1204 09:41:57.277510 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.277482436 podStartE2EDuration="2.277482436s" podCreationTimestamp="2025-12-04 09:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:57.27377145 +0000 UTC m=+162.140251827" watchObservedRunningTime="2025-12-04 09:41:57.277482436 +0000 UTC m=+162.143962813" Dec 04 09:41:57 crc kubenswrapper[4776]: I1204 09:41:57.806051 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs\") pod \"network-metrics-daemon-g5jzd\" (UID: \"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:41:57 crc kubenswrapper[4776]: I1204 09:41:57.825856 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cca4979-0471-4a2c-97ca-b6ec6fdd935d-metrics-certs\") pod \"network-metrics-daemon-g5jzd\" (UID: 
\"5cca4979-0471-4a2c-97ca-b6ec6fdd935d\") " pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:41:58 crc kubenswrapper[4776]: I1204 09:41:58.094789 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g5jzd" Dec 04 09:41:58 crc kubenswrapper[4776]: I1204 09:41:58.211777 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:58 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:58 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:58 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:58 crc kubenswrapper[4776]: I1204 09:41:58.211900 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:58 crc kubenswrapper[4776]: I1204 09:41:58.273627 4776 generic.go:334] "Generic (PLEG): container finished" podID="99ce50b0-2d5c-48d8-8348-f2ff58ea9906" containerID="15f0b0ff97d55356b7c4040689045c58bd2a9e25cfae1b617af1ecb22d2a7c09" exitCode=0 Dec 04 09:41:58 crc kubenswrapper[4776]: I1204 09:41:58.273683 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"99ce50b0-2d5c-48d8-8348-f2ff58ea9906","Type":"ContainerDied","Data":"15f0b0ff97d55356b7c4040689045c58bd2a9e25cfae1b617af1ecb22d2a7c09"} Dec 04 09:41:58 crc kubenswrapper[4776]: I1204 09:41:58.511840 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g5jzd"] Dec 04 09:41:58 crc kubenswrapper[4776]: W1204 09:41:58.529814 4776 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cca4979_0471_4a2c_97ca_b6ec6fdd935d.slice/crio-b3adfaab7a358712b53378150aa5f2a5e57ecbdd61310a680b10f2ae79463331 WatchSource:0}: Error finding container b3adfaab7a358712b53378150aa5f2a5e57ecbdd61310a680b10f2ae79463331: Status 404 returned error can't find the container with id b3adfaab7a358712b53378150aa5f2a5e57ecbdd61310a680b10f2ae79463331 Dec 04 09:41:58 crc kubenswrapper[4776]: I1204 09:41:58.675463 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:58 crc kubenswrapper[4776]: I1204 09:41:58.682496 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-vcljg" Dec 04 09:41:59 crc kubenswrapper[4776]: I1204 09:41:59.225575 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:41:59 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:41:59 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:41:59 crc kubenswrapper[4776]: healthz check failed Dec 04 09:41:59 crc kubenswrapper[4776]: I1204 09:41:59.225698 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:41:59 crc kubenswrapper[4776]: I1204 09:41:59.355475 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"edc2672d-2dec-4495-b2cb-cb2f5412d36a","Type":"ContainerStarted","Data":"32d7961b89c8b190415a8cef63af450079f7bc5cef14bb9af312738b1f63de0c"} Dec 04 09:41:59 crc kubenswrapper[4776]: 
I1204 09:41:59.404980 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" event={"ID":"5cca4979-0471-4a2c-97ca-b6ec6fdd935d","Type":"ContainerStarted","Data":"b3adfaab7a358712b53378150aa5f2a5e57ecbdd61310a680b10f2ae79463331"} Dec 04 09:41:59 crc kubenswrapper[4776]: I1204 09:41:59.580326 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rwkcl" Dec 04 09:41:59 crc kubenswrapper[4776]: I1204 09:41:59.598671 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.598643591 podStartE2EDuration="3.598643591s" podCreationTimestamp="2025-12-04 09:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:41:59.39367145 +0000 UTC m=+164.260151827" watchObservedRunningTime="2025-12-04 09:41:59.598643591 +0000 UTC m=+164.465123978" Dec 04 09:41:59 crc kubenswrapper[4776]: I1204 09:41:59.929113 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:41:59 crc kubenswrapper[4776]: I1204 09:41:59.975648 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99ce50b0-2d5c-48d8-8348-f2ff58ea9906-kubelet-dir\") pod \"99ce50b0-2d5c-48d8-8348-f2ff58ea9906\" (UID: \"99ce50b0-2d5c-48d8-8348-f2ff58ea9906\") " Dec 04 09:41:59 crc kubenswrapper[4776]: I1204 09:41:59.975726 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99ce50b0-2d5c-48d8-8348-f2ff58ea9906-kube-api-access\") pod \"99ce50b0-2d5c-48d8-8348-f2ff58ea9906\" (UID: \"99ce50b0-2d5c-48d8-8348-f2ff58ea9906\") " Dec 04 09:41:59 crc kubenswrapper[4776]: I1204 09:41:59.975802 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ce50b0-2d5c-48d8-8348-f2ff58ea9906-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "99ce50b0-2d5c-48d8-8348-f2ff58ea9906" (UID: "99ce50b0-2d5c-48d8-8348-f2ff58ea9906"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:41:59 crc kubenswrapper[4776]: I1204 09:41:59.976034 4776 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99ce50b0-2d5c-48d8-8348-f2ff58ea9906-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:00 crc kubenswrapper[4776]: I1204 09:42:00.008093 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ce50b0-2d5c-48d8-8348-f2ff58ea9906-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "99ce50b0-2d5c-48d8-8348-f2ff58ea9906" (UID: "99ce50b0-2d5c-48d8-8348-f2ff58ea9906"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:00 crc kubenswrapper[4776]: I1204 09:42:00.077543 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99ce50b0-2d5c-48d8-8348-f2ff58ea9906-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:00 crc kubenswrapper[4776]: I1204 09:42:00.214819 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:42:00 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:42:00 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:42:00 crc kubenswrapper[4776]: healthz check failed Dec 04 09:42:00 crc kubenswrapper[4776]: I1204 09:42:00.214932 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:42:00 crc kubenswrapper[4776]: I1204 09:42:00.430492 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"99ce50b0-2d5c-48d8-8348-f2ff58ea9906","Type":"ContainerDied","Data":"580aefab591a5a71b3ba1d2e0a45aeb9d00f65ce40fde5eebaf0876a4ff5f8c0"} Dec 04 09:42:00 crc kubenswrapper[4776]: I1204 09:42:00.432804 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="580aefab591a5a71b3ba1d2e0a45aeb9d00f65ce40fde5eebaf0876a4ff5f8c0" Dec 04 09:42:00 crc kubenswrapper[4776]: I1204 09:42:00.430534 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 09:42:00 crc kubenswrapper[4776]: I1204 09:42:00.450308 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" event={"ID":"5cca4979-0471-4a2c-97ca-b6ec6fdd935d","Type":"ContainerStarted","Data":"3abec41f59028c40ab07910aff106e2c7dedd070d9d6b2d09735248fa717d7c3"} Dec 04 09:42:01 crc kubenswrapper[4776]: I1204 09:42:01.212059 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:42:01 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:42:01 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:42:01 crc kubenswrapper[4776]: healthz check failed Dec 04 09:42:01 crc kubenswrapper[4776]: I1204 09:42:01.212130 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:42:02 crc kubenswrapper[4776]: I1204 09:42:02.210633 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:42:02 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:42:02 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:42:02 crc kubenswrapper[4776]: healthz check failed Dec 04 09:42:02 crc kubenswrapper[4776]: I1204 09:42:02.210709 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:42:03 crc kubenswrapper[4776]: I1204 09:42:03.102307 4776 patch_prober.go:28] interesting pod/console-f9d7485db-vm645 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 04 09:42:03 crc kubenswrapper[4776]: I1204 09:42:03.102664 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vm645" podUID="52d9a038-9fbd-4306-9e4a-00901ca865dc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 04 09:42:03 crc kubenswrapper[4776]: I1204 09:42:03.210759 4776 patch_prober.go:28] interesting pod/router-default-5444994796-mt567 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 09:42:03 crc kubenswrapper[4776]: [-]has-synced failed: reason withheld Dec 04 09:42:03 crc kubenswrapper[4776]: [+]process-running ok Dec 04 09:42:03 crc kubenswrapper[4776]: healthz check failed Dec 04 09:42:03 crc kubenswrapper[4776]: I1204 09:42:03.210836 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mt567" podUID="de616b95-4db7-46d2-99bd-1f9cabddcb71" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 09:42:03 crc kubenswrapper[4776]: I1204 09:42:03.482705 4776 generic.go:334] "Generic (PLEG): container finished" podID="edc2672d-2dec-4495-b2cb-cb2f5412d36a" containerID="32d7961b89c8b190415a8cef63af450079f7bc5cef14bb9af312738b1f63de0c" exitCode=0 Dec 04 09:42:03 crc kubenswrapper[4776]: I1204 09:42:03.482774 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"edc2672d-2dec-4495-b2cb-cb2f5412d36a","Type":"ContainerDied","Data":"32d7961b89c8b190415a8cef63af450079f7bc5cef14bb9af312738b1f63de0c"} Dec 04 09:42:03 crc kubenswrapper[4776]: I1204 09:42:03.649939 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-k9crr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 04 09:42:03 crc kubenswrapper[4776]: I1204 09:42:03.650003 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k9crr" podUID="7909bf76-0bc7-49e8-8711-f7229c71b3eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 04 09:42:03 crc kubenswrapper[4776]: I1204 09:42:03.650013 4776 patch_prober.go:28] interesting pod/downloads-7954f5f757-k9crr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 04 09:42:03 crc kubenswrapper[4776]: I1204 09:42:03.650092 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-k9crr" podUID="7909bf76-0bc7-49e8-8711-f7229c71b3eb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 04 09:42:04 crc kubenswrapper[4776]: I1204 09:42:04.211633 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:42:04 crc kubenswrapper[4776]: I1204 09:42:04.214455 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-mt567" Dec 04 09:42:04 
crc kubenswrapper[4776]: I1204 09:42:04.834615 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:42:04 crc kubenswrapper[4776]: I1204 09:42:04.861907 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/edc2672d-2dec-4495-b2cb-cb2f5412d36a-kubelet-dir\") pod \"edc2672d-2dec-4495-b2cb-cb2f5412d36a\" (UID: \"edc2672d-2dec-4495-b2cb-cb2f5412d36a\") " Dec 04 09:42:04 crc kubenswrapper[4776]: I1204 09:42:04.862155 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edc2672d-2dec-4495-b2cb-cb2f5412d36a-kube-api-access\") pod \"edc2672d-2dec-4495-b2cb-cb2f5412d36a\" (UID: \"edc2672d-2dec-4495-b2cb-cb2f5412d36a\") " Dec 04 09:42:04 crc kubenswrapper[4776]: I1204 09:42:04.862182 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edc2672d-2dec-4495-b2cb-cb2f5412d36a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "edc2672d-2dec-4495-b2cb-cb2f5412d36a" (UID: "edc2672d-2dec-4495-b2cb-cb2f5412d36a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:42:04 crc kubenswrapper[4776]: I1204 09:42:04.862481 4776 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/edc2672d-2dec-4495-b2cb-cb2f5412d36a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:04 crc kubenswrapper[4776]: I1204 09:42:04.871208 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc2672d-2dec-4495-b2cb-cb2f5412d36a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "edc2672d-2dec-4495-b2cb-cb2f5412d36a" (UID: "edc2672d-2dec-4495-b2cb-cb2f5412d36a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:04 crc kubenswrapper[4776]: I1204 09:42:04.963612 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edc2672d-2dec-4495-b2cb-cb2f5412d36a-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:05 crc kubenswrapper[4776]: I1204 09:42:05.505558 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 09:42:05 crc kubenswrapper[4776]: I1204 09:42:05.505582 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"edc2672d-2dec-4495-b2cb-cb2f5412d36a","Type":"ContainerDied","Data":"b4811b493ae8809b8238e44fa3168dd05d30a47e365bbf02c32dac430558770d"} Dec 04 09:42:05 crc kubenswrapper[4776]: I1204 09:42:05.505684 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4811b493ae8809b8238e44fa3168dd05d30a47e365bbf02c32dac430558770d" Dec 04 09:42:05 crc kubenswrapper[4776]: I1204 09:42:05.512218 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g5jzd" event={"ID":"5cca4979-0471-4a2c-97ca-b6ec6fdd935d","Type":"ContainerStarted","Data":"7b0fd691d83b058bd302ff6fe8e42d6c97708c2b0cd9123cc7577f7643a569d5"} Dec 04 09:42:05 crc kubenswrapper[4776]: I1204 09:42:05.543180 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-g5jzd" podStartSLOduration=150.543152981 podStartE2EDuration="2m30.543152981s" podCreationTimestamp="2025-12-04 09:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:42:05.538053763 +0000 UTC m=+170.404534150" watchObservedRunningTime="2025-12-04 09:42:05.543152981 +0000 UTC m=+170.409633358" Dec 04 09:42:13 crc 
kubenswrapper[4776]: I1204 09:42:13.105192 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:42:13 crc kubenswrapper[4776]: I1204 09:42:13.109877 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:42:13 crc kubenswrapper[4776]: I1204 09:42:13.467989 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:42:13 crc kubenswrapper[4776]: I1204 09:42:13.655022 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-k9crr" Dec 04 09:42:19 crc kubenswrapper[4776]: I1204 09:42:19.380579 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:42:19 crc kubenswrapper[4776]: I1204 09:42:19.381141 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:42:23 crc kubenswrapper[4776]: I1204 09:42:23.819340 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hjpj" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.629629 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 09:42:30 crc kubenswrapper[4776]: E1204 09:42:30.630670 4776 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="99ce50b0-2d5c-48d8-8348-f2ff58ea9906" containerName="pruner" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.630694 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ce50b0-2d5c-48d8-8348-f2ff58ea9906" containerName="pruner" Dec 04 09:42:30 crc kubenswrapper[4776]: E1204 09:42:30.630711 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc2672d-2dec-4495-b2cb-cb2f5412d36a" containerName="pruner" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.630723 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc2672d-2dec-4495-b2cb-cb2f5412d36a" containerName="pruner" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.630897 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ce50b0-2d5c-48d8-8348-f2ff58ea9906" containerName="pruner" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.630947 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc2672d-2dec-4495-b2cb-cb2f5412d36a" containerName="pruner" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.631583 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.636403 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.636599 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.647186 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.768478 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c13065ab-b545-4754-93a6-80fef65a37f2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c13065ab-b545-4754-93a6-80fef65a37f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.768616 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c13065ab-b545-4754-93a6-80fef65a37f2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c13065ab-b545-4754-93a6-80fef65a37f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.870841 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c13065ab-b545-4754-93a6-80fef65a37f2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c13065ab-b545-4754-93a6-80fef65a37f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.870988 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c13065ab-b545-4754-93a6-80fef65a37f2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c13065ab-b545-4754-93a6-80fef65a37f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.871124 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c13065ab-b545-4754-93a6-80fef65a37f2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c13065ab-b545-4754-93a6-80fef65a37f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.895564 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c13065ab-b545-4754-93a6-80fef65a37f2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c13065ab-b545-4754-93a6-80fef65a37f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:42:30 crc kubenswrapper[4776]: I1204 09:42:30.951970 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:42:31 crc kubenswrapper[4776]: E1204 09:42:31.148542 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 04 09:42:31 crc kubenswrapper[4776]: E1204 09:42:31.148776 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x48tq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kdjvq_openshift-marketplace(0f7b6e89-f248-4b5e-82ed-809ef10f018e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:42:31 crc kubenswrapper[4776]: E1204 09:42:31.150090 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kdjvq" podUID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" Dec 04 09:42:36 crc kubenswrapper[4776]: I1204 09:42:36.431670 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 09:42:36 crc kubenswrapper[4776]: I1204 09:42:36.435599 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:42:36 crc kubenswrapper[4776]: I1204 09:42:36.451084 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 09:42:36 crc kubenswrapper[4776]: I1204 09:42:36.458974 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:42:36 crc kubenswrapper[4776]: I1204 09:42:36.459233 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-var-lock\") pod \"installer-9-crc\" (UID: \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 
04 09:42:36 crc kubenswrapper[4776]: I1204 09:42:36.459321 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-kube-api-access\") pod \"installer-9-crc\" (UID: \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:42:36 crc kubenswrapper[4776]: I1204 09:42:36.561041 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-var-lock\") pod \"installer-9-crc\" (UID: \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:42:36 crc kubenswrapper[4776]: I1204 09:42:36.561090 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-kube-api-access\") pod \"installer-9-crc\" (UID: \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:42:36 crc kubenswrapper[4776]: I1204 09:42:36.561140 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:42:36 crc kubenswrapper[4776]: I1204 09:42:36.561183 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-var-lock\") pod \"installer-9-crc\" (UID: \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:42:36 crc kubenswrapper[4776]: I1204 09:42:36.561234 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:42:36 crc kubenswrapper[4776]: I1204 09:42:36.607235 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-kube-api-access\") pod \"installer-9-crc\" (UID: \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:42:36 crc kubenswrapper[4776]: E1204 09:42:36.761039 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kdjvq" podUID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" Dec 04 09:42:36 crc kubenswrapper[4776]: I1204 09:42:36.765070 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:42:37 crc kubenswrapper[4776]: E1204 09:42:37.716666 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 04 09:42:37 crc kubenswrapper[4776]: E1204 09:42:37.717270 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ngcxf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResiz
ePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jgwk9_openshift-marketplace(5e9d9dc3-bdee-489a-9e89-3fc75bd3025e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:42:37 crc kubenswrapper[4776]: E1204 09:42:37.718542 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jgwk9" podUID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" Dec 04 09:42:39 crc kubenswrapper[4776]: E1204 09:42:39.871233 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jgwk9" podUID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" Dec 04 09:42:39 crc kubenswrapper[4776]: E1204 09:42:39.998949 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 04 09:42:39 crc kubenswrapper[4776]: E1204 09:42:39.999484 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4xs2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4sbfh_openshift-marketplace(78f24e33-4605-4ded-98fc-83e96aa46b09): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:42:40 crc kubenswrapper[4776]: E1204 09:42:40.000739 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4sbfh" podUID="78f24e33-4605-4ded-98fc-83e96aa46b09" Dec 04 09:42:40 crc 
kubenswrapper[4776]: E1204 09:42:40.044849 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 04 09:42:40 crc kubenswrapper[4776]: E1204 09:42:40.045129 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-srv4b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-j2289_openshift-marketplace(719304d2-2416-40be-b76a-ca884c683161): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:42:40 crc kubenswrapper[4776]: E1204 09:42:40.047163 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-j2289" podUID="719304d2-2416-40be-b76a-ca884c683161" Dec 04 09:42:40 crc kubenswrapper[4776]: E1204 09:42:40.072754 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 04 09:42:40 crc kubenswrapper[4776]: E1204 09:42:40.072999 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4cjh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bhnrh_openshift-marketplace(e9c47789-6fcb-4f3d-9b38-99643a8fe1a2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:42:40 crc kubenswrapper[4776]: E1204 09:42:40.074305 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bhnrh" podUID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" Dec 04 09:42:40 crc 
kubenswrapper[4776]: E1204 09:42:40.110118 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 04 09:42:40 crc kubenswrapper[4776]: E1204 09:42:40.110331 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-85g6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-m27lb_openshift-marketplace(0b861681-465a-4c51-8663-ecd652c7c7b0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:42:40 crc kubenswrapper[4776]: E1204 09:42:40.112120 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-m27lb" podUID="0b861681-465a-4c51-8663-ecd652c7c7b0" Dec 04 09:42:41 crc kubenswrapper[4776]: E1204 09:42:41.092803 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bhnrh" podUID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" Dec 04 09:42:41 crc kubenswrapper[4776]: E1204 09:42:41.092833 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-m27lb" podUID="0b861681-465a-4c51-8663-ecd652c7c7b0" Dec 04 09:42:41 crc kubenswrapper[4776]: E1204 09:42:41.093145 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-j2289" podUID="719304d2-2416-40be-b76a-ca884c683161" Dec 04 09:42:41 crc kubenswrapper[4776]: E1204 09:42:41.158708 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 04 09:42:41 crc kubenswrapper[4776]: E1204 09:42:41.158954 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mx4rb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vrzln_openshift-marketplace(b1d11071-0dee-4b5f-989e-36b89f0eb26f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: 
context canceled" logger="UnhandledError" Dec 04 09:42:41 crc kubenswrapper[4776]: E1204 09:42:41.160156 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vrzln" podUID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" Dec 04 09:42:41 crc kubenswrapper[4776]: E1204 09:42:41.181216 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 04 09:42:41 crc kubenswrapper[4776]: E1204 09:42:41.181651 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vfmgd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-dcbwh_openshift-marketplace(740917a7-9acc-4dbd-8d31-329cfd0538b3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 09:42:41 crc kubenswrapper[4776]: E1204 09:42:41.182857 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-dcbwh" podUID="740917a7-9acc-4dbd-8d31-329cfd0538b3" Dec 04 09:42:41 crc 
kubenswrapper[4776]: I1204 09:42:41.502540 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 09:42:41 crc kubenswrapper[4776]: I1204 09:42:41.565153 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 09:42:41 crc kubenswrapper[4776]: W1204 09:42:41.581255 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaf3e726c_2adf_410e_8bd5_bcf6cebb2ac2.slice/crio-ede69e6bfaae173e5c7e74e591a5d407c5eea5286cff59ce4ea8acdbc7b70a7f WatchSource:0}: Error finding container ede69e6bfaae173e5c7e74e591a5d407c5eea5286cff59ce4ea8acdbc7b70a7f: Status 404 returned error can't find the container with id ede69e6bfaae173e5c7e74e591a5d407c5eea5286cff59ce4ea8acdbc7b70a7f Dec 04 09:42:41 crc kubenswrapper[4776]: I1204 09:42:41.753383 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c13065ab-b545-4754-93a6-80fef65a37f2","Type":"ContainerStarted","Data":"73379dacee155cc5059655c04fde75f896a9b9c8ac5bee98f3510a222bcee39a"} Dec 04 09:42:41 crc kubenswrapper[4776]: I1204 09:42:41.756150 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2","Type":"ContainerStarted","Data":"ede69e6bfaae173e5c7e74e591a5d407c5eea5286cff59ce4ea8acdbc7b70a7f"} Dec 04 09:42:41 crc kubenswrapper[4776]: E1204 09:42:41.757413 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-dcbwh" podUID="740917a7-9acc-4dbd-8d31-329cfd0538b3" Dec 04 09:42:41 crc kubenswrapper[4776]: E1204 09:42:41.759342 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vrzln" podUID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" Dec 04 09:42:42 crc kubenswrapper[4776]: I1204 09:42:42.763086 4776 generic.go:334] "Generic (PLEG): container finished" podID="c13065ab-b545-4754-93a6-80fef65a37f2" containerID="8dcbc363ac32f55042c959c455dd13f1a5c0349cea5f57f37072d780a4bea0d8" exitCode=0 Dec 04 09:42:42 crc kubenswrapper[4776]: I1204 09:42:42.763155 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c13065ab-b545-4754-93a6-80fef65a37f2","Type":"ContainerDied","Data":"8dcbc363ac32f55042c959c455dd13f1a5c0349cea5f57f37072d780a4bea0d8"} Dec 04 09:42:42 crc kubenswrapper[4776]: I1204 09:42:42.764684 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2","Type":"ContainerStarted","Data":"821316f5bb47998c103f5a1c410514fc20454bfc7d65a4dea87b128c7980d82c"} Dec 04 09:42:42 crc kubenswrapper[4776]: I1204 09:42:42.801282 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=6.8012611960000005 podStartE2EDuration="6.801261196s" podCreationTimestamp="2025-12-04 09:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:42:42.800579744 +0000 UTC m=+207.667060131" watchObservedRunningTime="2025-12-04 09:42:42.801261196 +0000 UTC m=+207.667741563" Dec 04 09:42:44 crc kubenswrapper[4776]: I1204 09:42:44.066116 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:42:44 crc kubenswrapper[4776]: I1204 09:42:44.186394 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c13065ab-b545-4754-93a6-80fef65a37f2-kube-api-access\") pod \"c13065ab-b545-4754-93a6-80fef65a37f2\" (UID: \"c13065ab-b545-4754-93a6-80fef65a37f2\") " Dec 04 09:42:44 crc kubenswrapper[4776]: I1204 09:42:44.186562 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c13065ab-b545-4754-93a6-80fef65a37f2-kubelet-dir\") pod \"c13065ab-b545-4754-93a6-80fef65a37f2\" (UID: \"c13065ab-b545-4754-93a6-80fef65a37f2\") " Dec 04 09:42:44 crc kubenswrapper[4776]: I1204 09:42:44.186832 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13065ab-b545-4754-93a6-80fef65a37f2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c13065ab-b545-4754-93a6-80fef65a37f2" (UID: "c13065ab-b545-4754-93a6-80fef65a37f2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:42:44 crc kubenswrapper[4776]: I1204 09:42:44.193841 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13065ab-b545-4754-93a6-80fef65a37f2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c13065ab-b545-4754-93a6-80fef65a37f2" (UID: "c13065ab-b545-4754-93a6-80fef65a37f2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:42:44 crc kubenswrapper[4776]: I1204 09:42:44.288228 4776 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c13065ab-b545-4754-93a6-80fef65a37f2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:44 crc kubenswrapper[4776]: I1204 09:42:44.288629 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c13065ab-b545-4754-93a6-80fef65a37f2-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:42:44 crc kubenswrapper[4776]: I1204 09:42:44.779597 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c13065ab-b545-4754-93a6-80fef65a37f2","Type":"ContainerDied","Data":"73379dacee155cc5059655c04fde75f896a9b9c8ac5bee98f3510a222bcee39a"} Dec 04 09:42:44 crc kubenswrapper[4776]: I1204 09:42:44.779644 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73379dacee155cc5059655c04fde75f896a9b9c8ac5bee98f3510a222bcee39a" Dec 04 09:42:44 crc kubenswrapper[4776]: I1204 09:42:44.779732 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 09:42:49 crc kubenswrapper[4776]: I1204 09:42:49.380317 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:42:49 crc kubenswrapper[4776]: I1204 09:42:49.380893 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:42:49 crc kubenswrapper[4776]: I1204 09:42:49.380979 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:42:49 crc kubenswrapper[4776]: I1204 09:42:49.381636 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 09:42:49 crc kubenswrapper[4776]: I1204 09:42:49.381766 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500" gracePeriod=600 Dec 04 09:42:49 crc kubenswrapper[4776]: I1204 09:42:49.808866 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500" exitCode=0 Dec 04 09:42:49 crc kubenswrapper[4776]: I1204 09:42:49.809092 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500"} Dec 04 09:42:49 crc kubenswrapper[4776]: I1204 09:42:49.809348 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"52063fc4909fedc01d5493701de9543a427e45e6bf2a8e7be858f79dd4a8bd99"} Dec 04 09:42:51 crc kubenswrapper[4776]: I1204 09:42:51.824327 4776 generic.go:334] "Generic (PLEG): container finished" podID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" containerID="a00cdd6f2deb49a92c5b8449632a8630e910ea9d11c93b61a9dc1198b892e35f" exitCode=0 Dec 04 09:42:51 crc kubenswrapper[4776]: I1204 09:42:51.824419 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdjvq" event={"ID":"0f7b6e89-f248-4b5e-82ed-809ef10f018e","Type":"ContainerDied","Data":"a00cdd6f2deb49a92c5b8449632a8630e910ea9d11c93b61a9dc1198b892e35f"} Dec 04 09:42:52 crc kubenswrapper[4776]: I1204 09:42:52.037865 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-94lk2"] Dec 04 09:42:52 crc kubenswrapper[4776]: I1204 09:42:52.833250 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jgwk9" event={"ID":"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e","Type":"ContainerStarted","Data":"da15151de55013eb1ec8accdce10da8983322ab7439fb365f449b7aa15d0b48e"} Dec 04 09:42:52 crc kubenswrapper[4776]: I1204 09:42:52.835700 4776 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-kdjvq" event={"ID":"0f7b6e89-f248-4b5e-82ed-809ef10f018e","Type":"ContainerStarted","Data":"0d073f2dd3e4a680a1adc100cc6a4ddad8782c436b8d7b8f2924380065638dcc"} Dec 04 09:42:52 crc kubenswrapper[4776]: I1204 09:42:52.877622 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kdjvq" podStartSLOduration=3.352807479 podStartE2EDuration="1m1.877598058s" podCreationTimestamp="2025-12-04 09:41:51 +0000 UTC" firstStartedPulling="2025-12-04 09:41:54.025138849 +0000 UTC m=+158.891619226" lastFinishedPulling="2025-12-04 09:42:52.549916268 +0000 UTC m=+217.416409805" observedRunningTime="2025-12-04 09:42:52.874168694 +0000 UTC m=+217.740649071" watchObservedRunningTime="2025-12-04 09:42:52.877598058 +0000 UTC m=+217.744078425" Dec 04 09:42:53 crc kubenswrapper[4776]: I1204 09:42:53.842643 4776 generic.go:334] "Generic (PLEG): container finished" podID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" containerID="9792a006f53b00fb0d095ab32ee3587c1918c06cfdcc1f1960505d7f9af976ce" exitCode=0 Dec 04 09:42:53 crc kubenswrapper[4776]: I1204 09:42:53.842842 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhnrh" event={"ID":"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2","Type":"ContainerDied","Data":"9792a006f53b00fb0d095ab32ee3587c1918c06cfdcc1f1960505d7f9af976ce"} Dec 04 09:42:53 crc kubenswrapper[4776]: I1204 09:42:53.847656 4776 generic.go:334] "Generic (PLEG): container finished" podID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" containerID="da15151de55013eb1ec8accdce10da8983322ab7439fb365f449b7aa15d0b48e" exitCode=0 Dec 04 09:42:53 crc kubenswrapper[4776]: I1204 09:42:53.847715 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jgwk9" event={"ID":"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e","Type":"ContainerDied","Data":"da15151de55013eb1ec8accdce10da8983322ab7439fb365f449b7aa15d0b48e"} 
Dec 04 09:42:53 crc kubenswrapper[4776]: I1204 09:42:53.849898 4776 generic.go:334] "Generic (PLEG): container finished" podID="719304d2-2416-40be-b76a-ca884c683161" containerID="0cc88e1a4fe819a4e16c47e47c63cccc8e674d2e250986cb902126922df77b96" exitCode=0 Dec 04 09:42:53 crc kubenswrapper[4776]: I1204 09:42:53.849937 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2289" event={"ID":"719304d2-2416-40be-b76a-ca884c683161","Type":"ContainerDied","Data":"0cc88e1a4fe819a4e16c47e47c63cccc8e674d2e250986cb902126922df77b96"} Dec 04 09:42:54 crc kubenswrapper[4776]: I1204 09:42:54.859000 4776 generic.go:334] "Generic (PLEG): container finished" podID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" containerID="9737a7bb927ed12b4315e23542e873e33f501e771ad00067da2edd2718ecfe34" exitCode=0 Dec 04 09:42:54 crc kubenswrapper[4776]: I1204 09:42:54.859233 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrzln" event={"ID":"b1d11071-0dee-4b5f-989e-36b89f0eb26f","Type":"ContainerDied","Data":"9737a7bb927ed12b4315e23542e873e33f501e771ad00067da2edd2718ecfe34"} Dec 04 09:42:54 crc kubenswrapper[4776]: I1204 09:42:54.862447 4776 generic.go:334] "Generic (PLEG): container finished" podID="740917a7-9acc-4dbd-8d31-329cfd0538b3" containerID="feeac19ac20f1f4516106837873c975e2b8c3f11a1904a21f7338f72be106539" exitCode=0 Dec 04 09:42:54 crc kubenswrapper[4776]: I1204 09:42:54.862487 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dcbwh" event={"ID":"740917a7-9acc-4dbd-8d31-329cfd0538b3","Type":"ContainerDied","Data":"feeac19ac20f1f4516106837873c975e2b8c3f11a1904a21f7338f72be106539"} Dec 04 09:42:55 crc kubenswrapper[4776]: I1204 09:42:55.871016 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jgwk9" 
event={"ID":"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e","Type":"ContainerStarted","Data":"4865a53a6304961e4875b1ada2b4cbba6eb52bfc296a8da8d60423016da8bc3f"} Dec 04 09:42:55 crc kubenswrapper[4776]: I1204 09:42:55.916097 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2289" event={"ID":"719304d2-2416-40be-b76a-ca884c683161","Type":"ContainerStarted","Data":"7684e23bc7cd3d6c1fb96eb424902f659852d4e9dbb8c437674156f61d0254d2"} Dec 04 09:42:55 crc kubenswrapper[4776]: I1204 09:42:55.918496 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhnrh" event={"ID":"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2","Type":"ContainerStarted","Data":"97f88c8096e0bf87982baa1059c381e8612ddc84d8b18e5e1c5ac3e3408b2265"} Dec 04 09:42:55 crc kubenswrapper[4776]: I1204 09:42:55.923066 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sbfh" event={"ID":"78f24e33-4605-4ded-98fc-83e96aa46b09","Type":"ContainerStarted","Data":"5ab5217f894570f3f880c012308f2c8bd416353684f7701ae186129acb55069e"} Dec 04 09:42:55 crc kubenswrapper[4776]: I1204 09:42:55.926881 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jgwk9" podStartSLOduration=3.5474344220000003 podStartE2EDuration="1m0.92683484s" podCreationTimestamp="2025-12-04 09:41:55 +0000 UTC" firstStartedPulling="2025-12-04 09:41:57.264411301 +0000 UTC m=+162.130891678" lastFinishedPulling="2025-12-04 09:42:54.643811719 +0000 UTC m=+219.510292096" observedRunningTime="2025-12-04 09:42:55.921890128 +0000 UTC m=+220.788370515" watchObservedRunningTime="2025-12-04 09:42:55.92683484 +0000 UTC m=+220.793315207" Dec 04 09:42:55 crc kubenswrapper[4776]: I1204 09:42:55.937800 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrzln" 
event={"ID":"b1d11071-0dee-4b5f-989e-36b89f0eb26f","Type":"ContainerStarted","Data":"79316d230ff1fb6fae0646366771fc060367b6f3c611122ae14424a8e5099834"} Dec 04 09:42:55 crc kubenswrapper[4776]: I1204 09:42:55.946953 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bhnrh" podStartSLOduration=5.078203437 podStartE2EDuration="1m4.946930148s" podCreationTimestamp="2025-12-04 09:41:51 +0000 UTC" firstStartedPulling="2025-12-04 09:41:55.139747175 +0000 UTC m=+160.006227552" lastFinishedPulling="2025-12-04 09:42:55.008473886 +0000 UTC m=+219.874954263" observedRunningTime="2025-12-04 09:42:55.945657798 +0000 UTC m=+220.812138195" watchObservedRunningTime="2025-12-04 09:42:55.946930148 +0000 UTC m=+220.813410515" Dec 04 09:42:55 crc kubenswrapper[4776]: I1204 09:42:55.947117 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dcbwh" event={"ID":"740917a7-9acc-4dbd-8d31-329cfd0538b3","Type":"ContainerStarted","Data":"1da4a793b43f0ff4206963476899e090395c3825093e7c88af192bb20dce822f"} Dec 04 09:42:55 crc kubenswrapper[4776]: I1204 09:42:55.965853 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j2289" podStartSLOduration=5.322483816 podStartE2EDuration="1m4.965830718s" podCreationTimestamp="2025-12-04 09:41:51 +0000 UTC" firstStartedPulling="2025-12-04 09:41:55.111521731 +0000 UTC m=+159.978002098" lastFinishedPulling="2025-12-04 09:42:54.754868623 +0000 UTC m=+219.621349000" observedRunningTime="2025-12-04 09:42:55.964007993 +0000 UTC m=+220.830488390" watchObservedRunningTime="2025-12-04 09:42:55.965830718 +0000 UTC m=+220.832311085" Dec 04 09:42:56 crc kubenswrapper[4776]: I1204 09:42:56.022393 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dcbwh" podStartSLOduration=4.007620676 podStartE2EDuration="1m3.022366146s" 
podCreationTimestamp="2025-12-04 09:41:53 +0000 UTC" firstStartedPulling="2025-12-04 09:41:56.234785915 +0000 UTC m=+161.101266292" lastFinishedPulling="2025-12-04 09:42:55.249531385 +0000 UTC m=+220.116011762" observedRunningTime="2025-12-04 09:42:55.98510056 +0000 UTC m=+220.851580957" watchObservedRunningTime="2025-12-04 09:42:56.022366146 +0000 UTC m=+220.888846523" Dec 04 09:42:56 crc kubenswrapper[4776]: I1204 09:42:56.051934 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vrzln" podStartSLOduration=3.90471264 podStartE2EDuration="1m3.051899493s" podCreationTimestamp="2025-12-04 09:41:53 +0000 UTC" firstStartedPulling="2025-12-04 09:41:56.219310976 +0000 UTC m=+161.085791353" lastFinishedPulling="2025-12-04 09:42:55.366497819 +0000 UTC m=+220.232978206" observedRunningTime="2025-12-04 09:42:56.047126067 +0000 UTC m=+220.913606444" watchObservedRunningTime="2025-12-04 09:42:56.051899493 +0000 UTC m=+220.918379870" Dec 04 09:42:56 crc kubenswrapper[4776]: I1204 09:42:56.958127 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m27lb" event={"ID":"0b861681-465a-4c51-8663-ecd652c7c7b0","Type":"ContainerStarted","Data":"5c963b57f49bb60f960e521d3ab80fdda70adf881ba59a13c427cccae4ee07df"} Dec 04 09:42:56 crc kubenswrapper[4776]: I1204 09:42:56.967682 4776 generic.go:334] "Generic (PLEG): container finished" podID="78f24e33-4605-4ded-98fc-83e96aa46b09" containerID="5ab5217f894570f3f880c012308f2c8bd416353684f7701ae186129acb55069e" exitCode=0 Dec 04 09:42:56 crc kubenswrapper[4776]: I1204 09:42:56.967742 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sbfh" event={"ID":"78f24e33-4605-4ded-98fc-83e96aa46b09","Type":"ContainerDied","Data":"5ab5217f894570f3f880c012308f2c8bd416353684f7701ae186129acb55069e"} Dec 04 09:42:56 crc kubenswrapper[4776]: I1204 09:42:56.967777 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-4sbfh" event={"ID":"78f24e33-4605-4ded-98fc-83e96aa46b09","Type":"ContainerStarted","Data":"04bbb21dcf532325ae870bb41f447332dde3e1edecb3754de8e66979bc2a6fad"} Dec 04 09:42:57 crc kubenswrapper[4776]: I1204 09:42:57.975939 4776 generic.go:334] "Generic (PLEG): container finished" podID="0b861681-465a-4c51-8663-ecd652c7c7b0" containerID="5c963b57f49bb60f960e521d3ab80fdda70adf881ba59a13c427cccae4ee07df" exitCode=0 Dec 04 09:42:57 crc kubenswrapper[4776]: I1204 09:42:57.975960 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m27lb" event={"ID":"0b861681-465a-4c51-8663-ecd652c7c7b0","Type":"ContainerDied","Data":"5c963b57f49bb60f960e521d3ab80fdda70adf881ba59a13c427cccae4ee07df"} Dec 04 09:42:57 crc kubenswrapper[4776]: I1204 09:42:57.995645 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4sbfh" podStartSLOduration=5.785019759 podStartE2EDuration="1m5.995620699s" podCreationTimestamp="2025-12-04 09:41:52 +0000 UTC" firstStartedPulling="2025-12-04 09:41:56.246758125 +0000 UTC m=+161.113238492" lastFinishedPulling="2025-12-04 09:42:56.457359055 +0000 UTC m=+221.323839432" observedRunningTime="2025-12-04 09:42:57.008987968 +0000 UTC m=+221.875468365" watchObservedRunningTime="2025-12-04 09:42:57.995620699 +0000 UTC m=+222.862101076" Dec 04 09:42:59 crc kubenswrapper[4776]: I1204 09:42:59.989661 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m27lb" event={"ID":"0b861681-465a-4c51-8663-ecd652c7c7b0","Type":"ContainerStarted","Data":"56b0ccf9a6ed6f4bff8d89041564e5f74d64b2a22d5ac919e087cdc41100ebbe"} Dec 04 09:43:00 crc kubenswrapper[4776]: I1204 09:43:00.010621 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m27lb" podStartSLOduration=3.293830373 
podStartE2EDuration="1m6.010599885s" podCreationTimestamp="2025-12-04 09:41:54 +0000 UTC" firstStartedPulling="2025-12-04 09:41:56.246118195 +0000 UTC m=+161.112598572" lastFinishedPulling="2025-12-04 09:42:58.962887707 +0000 UTC m=+223.829368084" observedRunningTime="2025-12-04 09:43:00.006985194 +0000 UTC m=+224.873465571" watchObservedRunningTime="2025-12-04 09:43:00.010599885 +0000 UTC m=+224.877080262" Dec 04 09:43:01 crc kubenswrapper[4776]: I1204 09:43:01.701397 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:43:01 crc kubenswrapper[4776]: I1204 09:43:01.701708 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:43:01 crc kubenswrapper[4776]: I1204 09:43:01.780494 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:43:02 crc kubenswrapper[4776]: I1204 09:43:02.036657 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:43:02 crc kubenswrapper[4776]: I1204 09:43:02.085997 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:43:02 crc kubenswrapper[4776]: I1204 09:43:02.086411 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:43:02 crc kubenswrapper[4776]: I1204 09:43:02.136632 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:43:02 crc kubenswrapper[4776]: I1204 09:43:02.340626 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j2289" Dec 04 09:43:02 crc kubenswrapper[4776]: I1204 09:43:02.340682 
4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j2289" Dec 04 09:43:02 crc kubenswrapper[4776]: I1204 09:43:02.395935 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j2289" Dec 04 09:43:02 crc kubenswrapper[4776]: I1204 09:43:02.614098 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:43:02 crc kubenswrapper[4776]: I1204 09:43:02.614181 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:43:03 crc kubenswrapper[4776]: I1204 09:43:03.040629 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:43:03 crc kubenswrapper[4776]: I1204 09:43:03.046779 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j2289" Dec 04 09:43:03 crc kubenswrapper[4776]: I1204 09:43:03.551697 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:43:03 crc kubenswrapper[4776]: I1204 09:43:03.593573 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:43:04 crc kubenswrapper[4776]: I1204 09:43:04.148370 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:43:04 crc kubenswrapper[4776]: I1204 09:43:04.150053 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:43:04 crc kubenswrapper[4776]: I1204 09:43:04.195110 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:43:04 crc kubenswrapper[4776]: I1204 09:43:04.356252 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:43:04 crc kubenswrapper[4776]: I1204 09:43:04.356959 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:43:04 crc kubenswrapper[4776]: I1204 09:43:04.402000 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:43:04 crc kubenswrapper[4776]: I1204 09:43:04.493622 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdjvq"] Dec 04 09:43:05 crc kubenswrapper[4776]: I1204 09:43:05.019506 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kdjvq" podUID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" containerName="registry-server" containerID="cri-o://0d073f2dd3e4a680a1adc100cc6a4ddad8782c436b8d7b8f2924380065638dcc" gracePeriod=2 Dec 04 09:43:05 crc kubenswrapper[4776]: I1204 09:43:05.064639 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:43:05 crc kubenswrapper[4776]: I1204 09:43:05.067465 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:43:05 crc kubenswrapper[4776]: I1204 09:43:05.092156 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4sbfh"] Dec 04 09:43:05 crc kubenswrapper[4776]: I1204 09:43:05.092429 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4sbfh" podUID="78f24e33-4605-4ded-98fc-83e96aa46b09" containerName="registry-server" 
containerID="cri-o://04bbb21dcf532325ae870bb41f447332dde3e1edecb3754de8e66979bc2a6fad" gracePeriod=2 Dec 04 09:43:05 crc kubenswrapper[4776]: I1204 09:43:05.279957 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:43:05 crc kubenswrapper[4776]: I1204 09:43:05.280316 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:43:05 crc kubenswrapper[4776]: I1204 09:43:05.324692 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:43:05 crc kubenswrapper[4776]: I1204 09:43:05.739734 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:43:05 crc kubenswrapper[4776]: I1204 09:43:05.740699 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:43:05 crc kubenswrapper[4776]: I1204 09:43:05.782453 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:43:06 crc kubenswrapper[4776]: I1204 09:43:06.061786 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:43:06 crc kubenswrapper[4776]: I1204 09:43:06.065060 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:43:06 crc kubenswrapper[4776]: I1204 09:43:06.888727 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dcbwh"] Dec 04 09:43:07 crc kubenswrapper[4776]: E1204 09:43:07.483896 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78f24e33_4605_4ded_98fc_83e96aa46b09.slice/crio-conmon-04bbb21dcf532325ae870bb41f447332dde3e1edecb3754de8e66979bc2a6fad.scope\": RecentStats: unable to find data in memory cache]" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.039130 4776 generic.go:334] "Generic (PLEG): container finished" podID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" containerID="0d073f2dd3e4a680a1adc100cc6a4ddad8782c436b8d7b8f2924380065638dcc" exitCode=0 Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.039204 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdjvq" event={"ID":"0f7b6e89-f248-4b5e-82ed-809ef10f018e","Type":"ContainerDied","Data":"0d073f2dd3e4a680a1adc100cc6a4ddad8782c436b8d7b8f2924380065638dcc"} Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.041007 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4sbfh_78f24e33-4605-4ded-98fc-83e96aa46b09/registry-server/0.log" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.041695 4776 generic.go:334] "Generic (PLEG): container finished" podID="78f24e33-4605-4ded-98fc-83e96aa46b09" containerID="04bbb21dcf532325ae870bb41f447332dde3e1edecb3754de8e66979bc2a6fad" exitCode=137 Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.041715 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sbfh" event={"ID":"78f24e33-4605-4ded-98fc-83e96aa46b09","Type":"ContainerDied","Data":"04bbb21dcf532325ae870bb41f447332dde3e1edecb3754de8e66979bc2a6fad"} Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.042178 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dcbwh" podUID="740917a7-9acc-4dbd-8d31-329cfd0538b3" containerName="registry-server" containerID="cri-o://1da4a793b43f0ff4206963476899e090395c3825093e7c88af192bb20dce822f" 
gracePeriod=2 Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.515464 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4sbfh_78f24e33-4605-4ded-98fc-83e96aa46b09/registry-server/0.log" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.516600 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.561418 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f24e33-4605-4ded-98fc-83e96aa46b09-catalog-content\") pod \"78f24e33-4605-4ded-98fc-83e96aa46b09\" (UID: \"78f24e33-4605-4ded-98fc-83e96aa46b09\") " Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.561518 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xs2d\" (UniqueName: \"kubernetes.io/projected/78f24e33-4605-4ded-98fc-83e96aa46b09-kube-api-access-4xs2d\") pod \"78f24e33-4605-4ded-98fc-83e96aa46b09\" (UID: \"78f24e33-4605-4ded-98fc-83e96aa46b09\") " Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.561621 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f24e33-4605-4ded-98fc-83e96aa46b09-utilities\") pod \"78f24e33-4605-4ded-98fc-83e96aa46b09\" (UID: \"78f24e33-4605-4ded-98fc-83e96aa46b09\") " Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.562530 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f24e33-4605-4ded-98fc-83e96aa46b09-utilities" (OuterVolumeSpecName: "utilities") pod "78f24e33-4605-4ded-98fc-83e96aa46b09" (UID: "78f24e33-4605-4ded-98fc-83e96aa46b09"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.563164 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f24e33-4605-4ded-98fc-83e96aa46b09-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.572076 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f24e33-4605-4ded-98fc-83e96aa46b09-kube-api-access-4xs2d" (OuterVolumeSpecName: "kube-api-access-4xs2d") pod "78f24e33-4605-4ded-98fc-83e96aa46b09" (UID: "78f24e33-4605-4ded-98fc-83e96aa46b09"). InnerVolumeSpecName "kube-api-access-4xs2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.610976 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f24e33-4605-4ded-98fc-83e96aa46b09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78f24e33-4605-4ded-98fc-83e96aa46b09" (UID: "78f24e33-4605-4ded-98fc-83e96aa46b09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.664771 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f24e33-4605-4ded-98fc-83e96aa46b09-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.664811 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xs2d\" (UniqueName: \"kubernetes.io/projected/78f24e33-4605-4ded-98fc-83e96aa46b09-kube-api-access-4xs2d\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.724277 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.765668 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f7b6e89-f248-4b5e-82ed-809ef10f018e-utilities\") pod \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\" (UID: \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\") " Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.765817 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f7b6e89-f248-4b5e-82ed-809ef10f018e-catalog-content\") pod \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\" (UID: \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\") " Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.765854 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x48tq\" (UniqueName: \"kubernetes.io/projected/0f7b6e89-f248-4b5e-82ed-809ef10f018e-kube-api-access-x48tq\") pod \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\" (UID: \"0f7b6e89-f248-4b5e-82ed-809ef10f018e\") " Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.766565 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f7b6e89-f248-4b5e-82ed-809ef10f018e-utilities" (OuterVolumeSpecName: "utilities") pod "0f7b6e89-f248-4b5e-82ed-809ef10f018e" (UID: "0f7b6e89-f248-4b5e-82ed-809ef10f018e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.768721 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f7b6e89-f248-4b5e-82ed-809ef10f018e-kube-api-access-x48tq" (OuterVolumeSpecName: "kube-api-access-x48tq") pod "0f7b6e89-f248-4b5e-82ed-809ef10f018e" (UID: "0f7b6e89-f248-4b5e-82ed-809ef10f018e"). InnerVolumeSpecName "kube-api-access-x48tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.816526 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f7b6e89-f248-4b5e-82ed-809ef10f018e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f7b6e89-f248-4b5e-82ed-809ef10f018e" (UID: "0f7b6e89-f248-4b5e-82ed-809ef10f018e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.868006 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f7b6e89-f248-4b5e-82ed-809ef10f018e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.868046 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x48tq\" (UniqueName: \"kubernetes.io/projected/0f7b6e89-f248-4b5e-82ed-809ef10f018e-kube-api-access-x48tq\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:08 crc kubenswrapper[4776]: I1204 09:43:08.868057 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f7b6e89-f248-4b5e-82ed-809ef10f018e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.050005 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdjvq" event={"ID":"0f7b6e89-f248-4b5e-82ed-809ef10f018e","Type":"ContainerDied","Data":"510996b1f23a75c8021a1c23e95116231af855c550fd1a0f202ce7487db5107d"} Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.050053 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kdjvq" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.050111 4776 scope.go:117] "RemoveContainer" containerID="0d073f2dd3e4a680a1adc100cc6a4ddad8782c436b8d7b8f2924380065638dcc" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.052820 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4sbfh_78f24e33-4605-4ded-98fc-83e96aa46b09/registry-server/0.log" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.054665 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sbfh" event={"ID":"78f24e33-4605-4ded-98fc-83e96aa46b09","Type":"ContainerDied","Data":"d3e09d6473adad5d26a6fd3f5d38b66e62bdacd9fcc49e369afb96ce6d5eb244"} Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.054712 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4sbfh" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.058753 4776 generic.go:334] "Generic (PLEG): container finished" podID="740917a7-9acc-4dbd-8d31-329cfd0538b3" containerID="1da4a793b43f0ff4206963476899e090395c3825093e7c88af192bb20dce822f" exitCode=0 Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.058808 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dcbwh" event={"ID":"740917a7-9acc-4dbd-8d31-329cfd0538b3","Type":"ContainerDied","Data":"1da4a793b43f0ff4206963476899e090395c3825093e7c88af192bb20dce822f"} Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.067594 4776 scope.go:117] "RemoveContainer" containerID="a00cdd6f2deb49a92c5b8449632a8630e910ea9d11c93b61a9dc1198b892e35f" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.084840 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdjvq"] Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 
09:43:09.094106 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kdjvq"] Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.097142 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4sbfh"] Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.099445 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4sbfh"] Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.101879 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.111364 4776 scope.go:117] "RemoveContainer" containerID="f69194bdf91c186b7c2756398944222d3f4830e70889fd5fce967487fe8c661b" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.128817 4776 scope.go:117] "RemoveContainer" containerID="04bbb21dcf532325ae870bb41f447332dde3e1edecb3754de8e66979bc2a6fad" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.144564 4776 scope.go:117] "RemoveContainer" containerID="5ab5217f894570f3f880c012308f2c8bd416353684f7701ae186129acb55069e" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.161486 4776 scope.go:117] "RemoveContainer" containerID="2a77b1ce84dc8fa8d318149cd943b11aa9b9f37558c0ec05a48bbd3ccdfc6cab" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.172910 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfmgd\" (UniqueName: \"kubernetes.io/projected/740917a7-9acc-4dbd-8d31-329cfd0538b3-kube-api-access-vfmgd\") pod \"740917a7-9acc-4dbd-8d31-329cfd0538b3\" (UID: \"740917a7-9acc-4dbd-8d31-329cfd0538b3\") " Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.172999 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/740917a7-9acc-4dbd-8d31-329cfd0538b3-catalog-content\") pod \"740917a7-9acc-4dbd-8d31-329cfd0538b3\" (UID: \"740917a7-9acc-4dbd-8d31-329cfd0538b3\") " Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.173072 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740917a7-9acc-4dbd-8d31-329cfd0538b3-utilities\") pod \"740917a7-9acc-4dbd-8d31-329cfd0538b3\" (UID: \"740917a7-9acc-4dbd-8d31-329cfd0538b3\") " Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.174115 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/740917a7-9acc-4dbd-8d31-329cfd0538b3-utilities" (OuterVolumeSpecName: "utilities") pod "740917a7-9acc-4dbd-8d31-329cfd0538b3" (UID: "740917a7-9acc-4dbd-8d31-329cfd0538b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.176709 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740917a7-9acc-4dbd-8d31-329cfd0538b3-kube-api-access-vfmgd" (OuterVolumeSpecName: "kube-api-access-vfmgd") pod "740917a7-9acc-4dbd-8d31-329cfd0538b3" (UID: "740917a7-9acc-4dbd-8d31-329cfd0538b3"). InnerVolumeSpecName "kube-api-access-vfmgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.190324 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/740917a7-9acc-4dbd-8d31-329cfd0538b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "740917a7-9acc-4dbd-8d31-329cfd0538b3" (UID: "740917a7-9acc-4dbd-8d31-329cfd0538b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.274282 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740917a7-9acc-4dbd-8d31-329cfd0538b3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.274506 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740917a7-9acc-4dbd-8d31-329cfd0538b3-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.275013 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfmgd\" (UniqueName: \"kubernetes.io/projected/740917a7-9acc-4dbd-8d31-329cfd0538b3-kube-api-access-vfmgd\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.467472 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" path="/var/lib/kubelet/pods/0f7b6e89-f248-4b5e-82ed-809ef10f018e/volumes" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.468354 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f24e33-4605-4ded-98fc-83e96aa46b09" path="/var/lib/kubelet/pods/78f24e33-4605-4ded-98fc-83e96aa46b09/volumes" Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.889825 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jgwk9"] Dec 04 09:43:09 crc kubenswrapper[4776]: I1204 09:43:09.890780 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jgwk9" podUID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" containerName="registry-server" containerID="cri-o://4865a53a6304961e4875b1ada2b4cbba6eb52bfc296a8da8d60423016da8bc3f" gracePeriod=2 Dec 04 09:43:10 crc kubenswrapper[4776]: I1204 09:43:10.069804 4776 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dcbwh" Dec 04 09:43:10 crc kubenswrapper[4776]: I1204 09:43:10.069882 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dcbwh" event={"ID":"740917a7-9acc-4dbd-8d31-329cfd0538b3","Type":"ContainerDied","Data":"ac5fc97e972fb6d8d40cd7f3e9e89de0a057d96621b56ab86aed230a2b6dddc4"} Dec 04 09:43:10 crc kubenswrapper[4776]: I1204 09:43:10.069961 4776 scope.go:117] "RemoveContainer" containerID="1da4a793b43f0ff4206963476899e090395c3825093e7c88af192bb20dce822f" Dec 04 09:43:10 crc kubenswrapper[4776]: I1204 09:43:10.087634 4776 scope.go:117] "RemoveContainer" containerID="feeac19ac20f1f4516106837873c975e2b8c3f11a1904a21f7338f72be106539" Dec 04 09:43:10 crc kubenswrapper[4776]: I1204 09:43:10.091059 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dcbwh"] Dec 04 09:43:10 crc kubenswrapper[4776]: I1204 09:43:10.093669 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dcbwh"] Dec 04 09:43:10 crc kubenswrapper[4776]: I1204 09:43:10.100072 4776 scope.go:117] "RemoveContainer" containerID="ad6c47871cab369b36405a67afaf242fb9ada6ffc16e8102bd3863708dedda96" Dec 04 09:43:11 crc kubenswrapper[4776]: I1204 09:43:11.459779 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740917a7-9acc-4dbd-8d31-329cfd0538b3" path="/var/lib/kubelet/pods/740917a7-9acc-4dbd-8d31-329cfd0538b3/volumes" Dec 04 09:43:11 crc kubenswrapper[4776]: I1204 09:43:11.922654 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.027210 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngcxf\" (UniqueName: \"kubernetes.io/projected/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-kube-api-access-ngcxf\") pod \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\" (UID: \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\") " Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.027322 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-utilities\") pod \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\" (UID: \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\") " Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.027376 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-catalog-content\") pod \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\" (UID: \"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e\") " Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.028581 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-utilities" (OuterVolumeSpecName: "utilities") pod "5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" (UID: "5e9d9dc3-bdee-489a-9e89-3fc75bd3025e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.031664 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-kube-api-access-ngcxf" (OuterVolumeSpecName: "kube-api-access-ngcxf") pod "5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" (UID: "5e9d9dc3-bdee-489a-9e89-3fc75bd3025e"). InnerVolumeSpecName "kube-api-access-ngcxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.090346 4776 generic.go:334] "Generic (PLEG): container finished" podID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" containerID="4865a53a6304961e4875b1ada2b4cbba6eb52bfc296a8da8d60423016da8bc3f" exitCode=0 Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.090403 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jgwk9" event={"ID":"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e","Type":"ContainerDied","Data":"4865a53a6304961e4875b1ada2b4cbba6eb52bfc296a8da8d60423016da8bc3f"} Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.090412 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jgwk9" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.090440 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jgwk9" event={"ID":"5e9d9dc3-bdee-489a-9e89-3fc75bd3025e","Type":"ContainerDied","Data":"54e25cc9ee53caaf678ff93c5de0de4be642540b8579d4f42fc6a8801c31fb30"} Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.090464 4776 scope.go:117] "RemoveContainer" containerID="4865a53a6304961e4875b1ada2b4cbba6eb52bfc296a8da8d60423016da8bc3f" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.107386 4776 scope.go:117] "RemoveContainer" containerID="da15151de55013eb1ec8accdce10da8983322ab7439fb365f449b7aa15d0b48e" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.121387 4776 scope.go:117] "RemoveContainer" containerID="90a81eaa2c60f3b5046315e542aa6df70a8e12ee8e525bf01e7cc7b4a39cb220" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.128701 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngcxf\" (UniqueName: \"kubernetes.io/projected/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-kube-api-access-ngcxf\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:12 crc 
kubenswrapper[4776]: I1204 09:43:12.128746 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.135553 4776 scope.go:117] "RemoveContainer" containerID="4865a53a6304961e4875b1ada2b4cbba6eb52bfc296a8da8d60423016da8bc3f" Dec 04 09:43:12 crc kubenswrapper[4776]: E1204 09:43:12.136165 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4865a53a6304961e4875b1ada2b4cbba6eb52bfc296a8da8d60423016da8bc3f\": container with ID starting with 4865a53a6304961e4875b1ada2b4cbba6eb52bfc296a8da8d60423016da8bc3f not found: ID does not exist" containerID="4865a53a6304961e4875b1ada2b4cbba6eb52bfc296a8da8d60423016da8bc3f" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.136218 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4865a53a6304961e4875b1ada2b4cbba6eb52bfc296a8da8d60423016da8bc3f"} err="failed to get container status \"4865a53a6304961e4875b1ada2b4cbba6eb52bfc296a8da8d60423016da8bc3f\": rpc error: code = NotFound desc = could not find container \"4865a53a6304961e4875b1ada2b4cbba6eb52bfc296a8da8d60423016da8bc3f\": container with ID starting with 4865a53a6304961e4875b1ada2b4cbba6eb52bfc296a8da8d60423016da8bc3f not found: ID does not exist" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.136258 4776 scope.go:117] "RemoveContainer" containerID="da15151de55013eb1ec8accdce10da8983322ab7439fb365f449b7aa15d0b48e" Dec 04 09:43:12 crc kubenswrapper[4776]: E1204 09:43:12.136663 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da15151de55013eb1ec8accdce10da8983322ab7439fb365f449b7aa15d0b48e\": container with ID starting with 
da15151de55013eb1ec8accdce10da8983322ab7439fb365f449b7aa15d0b48e not found: ID does not exist" containerID="da15151de55013eb1ec8accdce10da8983322ab7439fb365f449b7aa15d0b48e" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.136711 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da15151de55013eb1ec8accdce10da8983322ab7439fb365f449b7aa15d0b48e"} err="failed to get container status \"da15151de55013eb1ec8accdce10da8983322ab7439fb365f449b7aa15d0b48e\": rpc error: code = NotFound desc = could not find container \"da15151de55013eb1ec8accdce10da8983322ab7439fb365f449b7aa15d0b48e\": container with ID starting with da15151de55013eb1ec8accdce10da8983322ab7439fb365f449b7aa15d0b48e not found: ID does not exist" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.136743 4776 scope.go:117] "RemoveContainer" containerID="90a81eaa2c60f3b5046315e542aa6df70a8e12ee8e525bf01e7cc7b4a39cb220" Dec 04 09:43:12 crc kubenswrapper[4776]: E1204 09:43:12.137215 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90a81eaa2c60f3b5046315e542aa6df70a8e12ee8e525bf01e7cc7b4a39cb220\": container with ID starting with 90a81eaa2c60f3b5046315e542aa6df70a8e12ee8e525bf01e7cc7b4a39cb220 not found: ID does not exist" containerID="90a81eaa2c60f3b5046315e542aa6df70a8e12ee8e525bf01e7cc7b4a39cb220" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.137244 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a81eaa2c60f3b5046315e542aa6df70a8e12ee8e525bf01e7cc7b4a39cb220"} err="failed to get container status \"90a81eaa2c60f3b5046315e542aa6df70a8e12ee8e525bf01e7cc7b4a39cb220\": rpc error: code = NotFound desc = could not find container \"90a81eaa2c60f3b5046315e542aa6df70a8e12ee8e525bf01e7cc7b4a39cb220\": container with ID starting with 90a81eaa2c60f3b5046315e542aa6df70a8e12ee8e525bf01e7cc7b4a39cb220 not found: ID does not 
exist" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.141135 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" (UID: "5e9d9dc3-bdee-489a-9e89-3fc75bd3025e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.230236 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.417868 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jgwk9"] Dec 04 09:43:12 crc kubenswrapper[4776]: I1204 09:43:12.422372 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jgwk9"] Dec 04 09:43:13 crc kubenswrapper[4776]: I1204 09:43:13.458895 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" path="/var/lib/kubelet/pods/5e9d9dc3-bdee-489a-9e89-3fc75bd3025e/volumes" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.062480 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" podUID="7c916477-5fc5-43cc-b409-01e423c554a2" containerName="oauth-openshift" containerID="cri-o://e53c2dfa8444b7a1222f9af9a91cc432342bf56da1c5e6d0182f876b6394a9d4" gracePeriod=15 Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.423400 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.507104 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-cliconfig\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.507164 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-trusted-ca-bundle\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.507219 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-session\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.507244 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-877g5\" (UniqueName: \"kubernetes.io/projected/7c916477-5fc5-43cc-b409-01e423c554a2-kube-api-access-877g5\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.507268 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-error\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 
04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.507861 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.507976 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.508049 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-router-certs\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.508076 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c916477-5fc5-43cc-b409-01e423c554a2-audit-dir\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.508099 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-ocp-branding-template\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.508145 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-provider-selection\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.508169 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-login\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.508220 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-serving-cert\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.508245 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-audit-policies\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.508273 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-idp-0-file-data\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.508313 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-service-ca\") pod \"7c916477-5fc5-43cc-b409-01e423c554a2\" (UID: \"7c916477-5fc5-43cc-b409-01e423c554a2\") " Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.508764 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.508780 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.509319 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c916477-5fc5-43cc-b409-01e423c554a2-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.509956 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.510035 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.514286 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c916477-5fc5-43cc-b409-01e423c554a2-kube-api-access-877g5" (OuterVolumeSpecName: "kube-api-access-877g5") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). InnerVolumeSpecName "kube-api-access-877g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.518070 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.518497 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.518752 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.518851 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.519196 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.519374 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.519773 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.519981 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7c916477-5fc5-43cc-b409-01e423c554a2" (UID: "7c916477-5fc5-43cc-b409-01e423c554a2"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.609925 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.609980 4776 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.609993 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.610005 4776 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.610016 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.610029 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-877g5\" (UniqueName: \"kubernetes.io/projected/7c916477-5fc5-43cc-b409-01e423c554a2-kube-api-access-877g5\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.610040 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.610051 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.610062 4776 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c916477-5fc5-43cc-b409-01e423c554a2-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.610073 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 
09:43:17.610084 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:17 crc kubenswrapper[4776]: I1204 09:43:17.610096 4776 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7c916477-5fc5-43cc-b409-01e423c554a2-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:18 crc kubenswrapper[4776]: I1204 09:43:18.126319 4776 generic.go:334] "Generic (PLEG): container finished" podID="7c916477-5fc5-43cc-b409-01e423c554a2" containerID="e53c2dfa8444b7a1222f9af9a91cc432342bf56da1c5e6d0182f876b6394a9d4" exitCode=0 Dec 04 09:43:18 crc kubenswrapper[4776]: I1204 09:43:18.126375 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" event={"ID":"7c916477-5fc5-43cc-b409-01e423c554a2","Type":"ContainerDied","Data":"e53c2dfa8444b7a1222f9af9a91cc432342bf56da1c5e6d0182f876b6394a9d4"} Dec 04 09:43:18 crc kubenswrapper[4776]: I1204 09:43:18.126407 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" Dec 04 09:43:18 crc kubenswrapper[4776]: I1204 09:43:18.126417 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-94lk2" event={"ID":"7c916477-5fc5-43cc-b409-01e423c554a2","Type":"ContainerDied","Data":"b3cf98397c74aeb0b5af0433d79a2b3ae0cad7dedcb60bf39db0c7eb55ac8581"} Dec 04 09:43:18 crc kubenswrapper[4776]: I1204 09:43:18.126439 4776 scope.go:117] "RemoveContainer" containerID="e53c2dfa8444b7a1222f9af9a91cc432342bf56da1c5e6d0182f876b6394a9d4" Dec 04 09:43:18 crc kubenswrapper[4776]: I1204 09:43:18.152367 4776 scope.go:117] "RemoveContainer" containerID="e53c2dfa8444b7a1222f9af9a91cc432342bf56da1c5e6d0182f876b6394a9d4" Dec 04 09:43:18 crc kubenswrapper[4776]: E1204 09:43:18.154746 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e53c2dfa8444b7a1222f9af9a91cc432342bf56da1c5e6d0182f876b6394a9d4\": container with ID starting with e53c2dfa8444b7a1222f9af9a91cc432342bf56da1c5e6d0182f876b6394a9d4 not found: ID does not exist" containerID="e53c2dfa8444b7a1222f9af9a91cc432342bf56da1c5e6d0182f876b6394a9d4" Dec 04 09:43:18 crc kubenswrapper[4776]: I1204 09:43:18.154797 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e53c2dfa8444b7a1222f9af9a91cc432342bf56da1c5e6d0182f876b6394a9d4"} err="failed to get container status \"e53c2dfa8444b7a1222f9af9a91cc432342bf56da1c5e6d0182f876b6394a9d4\": rpc error: code = NotFound desc = could not find container \"e53c2dfa8444b7a1222f9af9a91cc432342bf56da1c5e6d0182f876b6394a9d4\": container with ID starting with e53c2dfa8444b7a1222f9af9a91cc432342bf56da1c5e6d0182f876b6394a9d4 not found: ID does not exist" Dec 04 09:43:18 crc kubenswrapper[4776]: I1204 09:43:18.166426 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-94lk2"] Dec 04 09:43:18 crc kubenswrapper[4776]: I1204 09:43:18.171955 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-94lk2"] Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.463252 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c916477-5fc5-43cc-b409-01e423c554a2" path="/var/lib/kubelet/pods/7c916477-5fc5-43cc-b409-01e423c554a2/volumes" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.582274 4776 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.583432 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd" gracePeriod=15 Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.583513 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b" gracePeriod=15 Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.583470 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b" gracePeriod=15 Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.583540 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb" gracePeriod=15 Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.583511 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774" gracePeriod=15 Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.584417 4776 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.585783 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f24e33-4605-4ded-98fc-83e96aa46b09" containerName="extract-content" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.585808 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f24e33-4605-4ded-98fc-83e96aa46b09" containerName="extract-content" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.585823 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" containerName="registry-server" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.585856 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" containerName="registry-server" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.585868 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.585878 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 
09:43:19.585888 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" containerName="extract-content" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.585896 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" containerName="extract-content" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.585904 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" containerName="registry-server" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.585911 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" containerName="registry-server" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.585924 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.585946 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.585958 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.585965 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.585973 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13065ab-b545-4754-93a6-80fef65a37f2" containerName="pruner" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.585984 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13065ab-b545-4754-93a6-80fef65a37f2" containerName="pruner" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.586005 4776 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7c916477-5fc5-43cc-b409-01e423c554a2" containerName="oauth-openshift" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586013 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c916477-5fc5-43cc-b409-01e423c554a2" containerName="oauth-openshift" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.586022 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586029 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.586039 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740917a7-9acc-4dbd-8d31-329cfd0538b3" containerName="extract-utilities" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586046 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="740917a7-9acc-4dbd-8d31-329cfd0538b3" containerName="extract-utilities" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.586057 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586064 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.586076 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" containerName="extract-utilities" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586084 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" containerName="extract-utilities" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 
09:43:19.586095 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" containerName="extract-utilities" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586104 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" containerName="extract-utilities" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.586114 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586124 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.586133 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740917a7-9acc-4dbd-8d31-329cfd0538b3" containerName="registry-server" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586140 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="740917a7-9acc-4dbd-8d31-329cfd0538b3" containerName="registry-server" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.586151 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740917a7-9acc-4dbd-8d31-329cfd0538b3" containerName="extract-content" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586162 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="740917a7-9acc-4dbd-8d31-329cfd0538b3" containerName="extract-content" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.586172 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" containerName="extract-content" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586181 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" containerName="extract-content" Dec 04 09:43:19 crc 
kubenswrapper[4776]: E1204 09:43:19.586192 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f24e33-4605-4ded-98fc-83e96aa46b09" containerName="extract-utilities" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586200 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f24e33-4605-4ded-98fc-83e96aa46b09" containerName="extract-utilities" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.586211 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f24e33-4605-4ded-98fc-83e96aa46b09" containerName="registry-server" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586218 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f24e33-4605-4ded-98fc-83e96aa46b09" containerName="registry-server" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586335 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586349 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586361 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9d9dc3-bdee-489a-9e89-3fc75bd3025e" containerName="registry-server" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586371 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="740917a7-9acc-4dbd-8d31-329cfd0538b3" containerName="registry-server" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586378 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586386 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f7b6e89-f248-4b5e-82ed-809ef10f018e" 
containerName="registry-server" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586395 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586407 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586415 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f24e33-4605-4ded-98fc-83e96aa46b09" containerName="registry-server" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586427 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586436 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c916477-5fc5-43cc-b409-01e423c554a2" containerName="oauth-openshift" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586444 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13065ab-b545-4754-93a6-80fef65a37f2" containerName="pruner" Dec 04 09:43:19 crc kubenswrapper[4776]: E1204 09:43:19.586539 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.586546 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.587723 4776 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.588354 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.593602 4776 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.643864 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.643923 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.643970 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.644327 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: 
I1204 09:43:19.644495 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.644591 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.644667 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.644776 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.746505 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.746624 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.746709 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.746749 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.746816 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.746822 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:19 crc 
kubenswrapper[4776]: I1204 09:43:19.747004 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.746731 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.746897 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.746921 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.746911 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.746844 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.747215 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.747262 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.747291 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:19 crc kubenswrapper[4776]: I1204 09:43:19.747397 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:20 crc kubenswrapper[4776]: I1204 09:43:20.141251 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" containerID="821316f5bb47998c103f5a1c410514fc20454bfc7d65a4dea87b128c7980d82c" exitCode=0 Dec 04 09:43:20 crc kubenswrapper[4776]: I1204 09:43:20.141346 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2","Type":"ContainerDied","Data":"821316f5bb47998c103f5a1c410514fc20454bfc7d65a4dea87b128c7980d82c"} Dec 04 09:43:20 crc kubenswrapper[4776]: I1204 09:43:20.142675 4776 status_manager.go:851] "Failed to get status for pod" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:20 crc kubenswrapper[4776]: I1204 09:43:20.143163 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 09:43:20 crc kubenswrapper[4776]: I1204 09:43:20.144223 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 09:43:20 crc kubenswrapper[4776]: I1204 09:43:20.144787 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b" exitCode=0 Dec 04 09:43:20 crc kubenswrapper[4776]: I1204 09:43:20.144810 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b" exitCode=0 Dec 04 09:43:20 crc kubenswrapper[4776]: I1204 09:43:20.144822 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774" exitCode=0 Dec 04 09:43:20 crc kubenswrapper[4776]: I1204 09:43:20.144831 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb" exitCode=2 Dec 04 09:43:20 crc kubenswrapper[4776]: I1204 09:43:20.144868 4776 scope.go:117] "RemoveContainer" containerID="ca01e2b10ec948bb4ecb1640592c80dfe1f844a17864a90e828473b26516b3f4" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.152663 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.361303 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.362308 4776 status_manager.go:851] "Failed to get status for pod" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.483597 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-kube-api-access\") pod \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\" (UID: \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\") " Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.483804 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-var-lock\") pod \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\" (UID: 
\"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\") " Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.483844 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-kubelet-dir\") pod \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\" (UID: \"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2\") " Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.483980 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-var-lock" (OuterVolumeSpecName: "var-lock") pod "af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" (UID: "af3e726c-2adf-410e-8bd5-bcf6cebb2ac2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.484075 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" (UID: "af3e726c-2adf-410e-8bd5-bcf6cebb2ac2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.484267 4776 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-var-lock\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.484287 4776 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.489563 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" (UID: "af3e726c-2adf-410e-8bd5-bcf6cebb2ac2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.585782 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af3e726c-2adf-410e-8bd5-bcf6cebb2ac2-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.953189 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.954211 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.954870 4776 status_manager.go:851] "Failed to get status for pod" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.955177 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.990058 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.990110 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.990171 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.990191 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.990195 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.990377 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.990676 4776 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.990698 4776 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:21 crc kubenswrapper[4776]: I1204 09:43:21.990707 4776 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.163263 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"af3e726c-2adf-410e-8bd5-bcf6cebb2ac2","Type":"ContainerDied","Data":"ede69e6bfaae173e5c7e74e591a5d407c5eea5286cff59ce4ea8acdbc7b70a7f"} Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.163343 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ede69e6bfaae173e5c7e74e591a5d407c5eea5286cff59ce4ea8acdbc7b70a7f" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.163542 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.166484 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.167294 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd" exitCode=0 Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.167370 4776 scope.go:117] "RemoveContainer" containerID="865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.167414 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.179364 4776 status_manager.go:851] "Failed to get status for pod" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.179863 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.183956 4776 scope.go:117] "RemoveContainer" containerID="b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.186020 4776 status_manager.go:851] "Failed to get status 
for pod" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.186501 4776 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.196758 4776 scope.go:117] "RemoveContainer" containerID="02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.220087 4776 scope.go:117] "RemoveContainer" containerID="1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.235291 4776 scope.go:117] "RemoveContainer" containerID="2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.251046 4776 scope.go:117] "RemoveContainer" containerID="347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.268060 4776 scope.go:117] "RemoveContainer" containerID="865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b" Dec 04 09:43:22 crc kubenswrapper[4776]: E1204 09:43:22.268492 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\": container with ID starting with 865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b not found: ID does not exist" 
containerID="865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.268526 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b"} err="failed to get container status \"865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\": rpc error: code = NotFound desc = could not find container \"865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b\": container with ID starting with 865106cbb1fb26a2fd326eaac51fd3f6a4ce8d1f37282c6a619a2e199ed6872b not found: ID does not exist" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.268553 4776 scope.go:117] "RemoveContainer" containerID="b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b" Dec 04 09:43:22 crc kubenswrapper[4776]: E1204 09:43:22.268742 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\": container with ID starting with b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b not found: ID does not exist" containerID="b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.268766 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b"} err="failed to get container status \"b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\": rpc error: code = NotFound desc = could not find container \"b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b\": container with ID starting with b18b6d239a3d21cefea23735927c8fe719a56346d470dbdbd89be182fe1d613b not found: ID does not exist" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.268779 4776 scope.go:117] 
"RemoveContainer" containerID="02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774" Dec 04 09:43:22 crc kubenswrapper[4776]: E1204 09:43:22.268973 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\": container with ID starting with 02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774 not found: ID does not exist" containerID="02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.269014 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774"} err="failed to get container status \"02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\": rpc error: code = NotFound desc = could not find container \"02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774\": container with ID starting with 02ec06f72e9a40a9c1e148ae9db362ae6e733ca1a45790c98750a29ce0b9c774 not found: ID does not exist" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.269038 4776 scope.go:117] "RemoveContainer" containerID="1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb" Dec 04 09:43:22 crc kubenswrapper[4776]: E1204 09:43:22.269239 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\": container with ID starting with 1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb not found: ID does not exist" containerID="1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.269269 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb"} err="failed to get container status \"1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\": rpc error: code = NotFound desc = could not find container \"1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb\": container with ID starting with 1354b62999a2bebbfdfc6866295b038b5f696545cc9626a858cc97c1b04cb1bb not found: ID does not exist" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.269285 4776 scope.go:117] "RemoveContainer" containerID="2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd" Dec 04 09:43:22 crc kubenswrapper[4776]: E1204 09:43:22.269498 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\": container with ID starting with 2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd not found: ID does not exist" containerID="2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.269532 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd"} err="failed to get container status \"2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\": rpc error: code = NotFound desc = could not find container \"2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd\": container with ID starting with 2fe8fce76cb6fe087e3d3203cf6ca01534e7e8cabe651684752250af8a5bbdbd not found: ID does not exist" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.269551 4776 scope.go:117] "RemoveContainer" containerID="347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0" Dec 04 09:43:22 crc kubenswrapper[4776]: E1204 09:43:22.269810 4776 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\": container with ID starting with 347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0 not found: ID does not exist" containerID="347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0" Dec 04 09:43:22 crc kubenswrapper[4776]: I1204 09:43:22.269837 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0"} err="failed to get container status \"347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\": rpc error: code = NotFound desc = could not find container \"347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0\": container with ID starting with 347f7085728ecfdd9301da2efc0e537de77a1e5c7fee807c52cdb0d745eb47a0 not found: ID does not exist" Dec 04 09:43:23 crc kubenswrapper[4776]: I1204 09:43:23.459801 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 04 09:43:24 crc kubenswrapper[4776]: E1204 09:43:24.621705 4776 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.173:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:24 crc kubenswrapper[4776]: I1204 09:43:24.622412 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:24 crc kubenswrapper[4776]: E1204 09:43:24.653462 4776 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.173:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187df9e8313d5b8e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 09:43:24.652936078 +0000 UTC m=+249.519416465,LastTimestamp:2025-12-04 09:43:24.652936078 +0000 UTC m=+249.519416465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 09:43:25 crc kubenswrapper[4776]: I1204 09:43:25.191621 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d4d30bbc4b844caa94b6632c5448d7dc8ca5591a846936a1537e6479be80daff"} Dec 04 09:43:25 crc kubenswrapper[4776]: I1204 09:43:25.192165 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"17364c7e18d7697b8ac3ee0ff91efa97557e7c82231c6c44371595902f905cee"} Dec 04 09:43:25 crc 
kubenswrapper[4776]: I1204 09:43:25.193162 4776 status_manager.go:851] "Failed to get status for pod" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:25 crc kubenswrapper[4776]: E1204 09:43:25.193231 4776 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.173:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:43:25 crc kubenswrapper[4776]: I1204 09:43:25.455031 4776 status_manager.go:851] "Failed to get status for pod" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:25 crc kubenswrapper[4776]: E1204 09:43:25.781222 4776 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.173:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187df9e8313d5b8e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 09:43:24.652936078 +0000 UTC m=+249.519416465,LastTimestamp:2025-12-04 09:43:24.652936078 +0000 UTC m=+249.519416465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 09:43:28 crc kubenswrapper[4776]: E1204 09:43:28.963130 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:28 crc kubenswrapper[4776]: E1204 09:43:28.964153 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:28 crc kubenswrapper[4776]: E1204 09:43:28.964684 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:28 crc kubenswrapper[4776]: E1204 09:43:28.965057 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:28 crc kubenswrapper[4776]: E1204 09:43:28.965336 4776 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:28 crc kubenswrapper[4776]: I1204 09:43:28.965371 4776 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts 
to update lease" Dec 04 09:43:28 crc kubenswrapper[4776]: E1204 09:43:28.965694 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="200ms" Dec 04 09:43:29 crc kubenswrapper[4776]: E1204 09:43:29.166738 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="400ms" Dec 04 09:43:29 crc kubenswrapper[4776]: E1204 09:43:29.567778 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="800ms" Dec 04 09:43:30 crc kubenswrapper[4776]: E1204 09:43:30.369387 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="1.6s" Dec 04 09:43:31 crc kubenswrapper[4776]: E1204 09:43:31.970281 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="3.2s" Dec 04 09:43:32 crc kubenswrapper[4776]: I1204 09:43:32.236079 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 09:43:32 crc kubenswrapper[4776]: I1204 09:43:32.236381 4776 
generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317" exitCode=1 Dec 04 09:43:32 crc kubenswrapper[4776]: I1204 09:43:32.236430 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317"} Dec 04 09:43:32 crc kubenswrapper[4776]: I1204 09:43:32.237105 4776 scope.go:117] "RemoveContainer" containerID="79dfcaaa037eaf0bbce948d05d4005025e54776b20b1767054b54df011ef4317" Dec 04 09:43:32 crc kubenswrapper[4776]: I1204 09:43:32.237327 4776 status_manager.go:851] "Failed to get status for pod" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:32 crc kubenswrapper[4776]: I1204 09:43:32.237670 4776 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:33 crc kubenswrapper[4776]: I1204 09:43:33.019903 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:43:33 crc kubenswrapper[4776]: I1204 09:43:33.246175 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 09:43:33 crc kubenswrapper[4776]: I1204 
09:43:33.246662 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"afa1f8efe136c1c0adaa7c588964cf2b83e03b7211f324eb893ecb4dadd8bab5"} Dec 04 09:43:33 crc kubenswrapper[4776]: I1204 09:43:33.247861 4776 status_manager.go:851] "Failed to get status for pod" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:33 crc kubenswrapper[4776]: I1204 09:43:33.248526 4776 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:34 crc kubenswrapper[4776]: I1204 09:43:34.452258 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:34 crc kubenswrapper[4776]: I1204 09:43:34.453428 4776 status_manager.go:851] "Failed to get status for pod" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:34 crc kubenswrapper[4776]: I1204 09:43:34.454114 4776 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:34 crc kubenswrapper[4776]: I1204 09:43:34.474349 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e59a3c6-f022-4e05-a66d-a763ec43e08c" Dec 04 09:43:34 crc kubenswrapper[4776]: I1204 09:43:34.474401 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e59a3c6-f022-4e05-a66d-a763ec43e08c" Dec 04 09:43:34 crc kubenswrapper[4776]: E1204 09:43:34.475251 4776 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:34 crc kubenswrapper[4776]: I1204 09:43:34.476281 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:35 crc kubenswrapper[4776]: E1204 09:43:35.171864 4776 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="6.4s" Dec 04 09:43:35 crc kubenswrapper[4776]: I1204 09:43:35.257980 4776 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ba078a8f5161a1cfdffea28cb0dc4138408cfc4c516b9c8161d3c2c93194906f" exitCode=0 Dec 04 09:43:35 crc kubenswrapper[4776]: I1204 09:43:35.258035 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ba078a8f5161a1cfdffea28cb0dc4138408cfc4c516b9c8161d3c2c93194906f"} Dec 04 09:43:35 crc kubenswrapper[4776]: I1204 09:43:35.258072 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d6aa321c329205322a204b3e983f4af4bb3948cb42baaed165944a5da653951a"} Dec 04 09:43:35 crc kubenswrapper[4776]: I1204 09:43:35.258329 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e59a3c6-f022-4e05-a66d-a763ec43e08c" Dec 04 09:43:35 crc kubenswrapper[4776]: I1204 09:43:35.258343 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e59a3c6-f022-4e05-a66d-a763ec43e08c" Dec 04 09:43:35 crc kubenswrapper[4776]: E1204 09:43:35.258777 4776 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:35 crc kubenswrapper[4776]: I1204 09:43:35.258936 4776 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:35 crc kubenswrapper[4776]: I1204 09:43:35.259306 4776 status_manager.go:851] "Failed to get status for pod" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:35 crc kubenswrapper[4776]: I1204 09:43:35.457107 4776 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:35 crc kubenswrapper[4776]: I1204 09:43:35.457373 4776 status_manager.go:851] "Failed to get status for pod" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.173:6443: connect: connection refused" Dec 04 09:43:35 crc kubenswrapper[4776]: I1204 09:43:35.457594 4776 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.173:6443: connect: connection refused" Dec 04 09:43:36 crc kubenswrapper[4776]: I1204 09:43:36.269149 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7dc2944310cf9a16ed77af411d45d432a01d277201061b11b3ffe389687a951b"} Dec 04 09:43:36 crc kubenswrapper[4776]: I1204 09:43:36.272084 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2ee6a85ef03fe32bff14264167f39ae30af8c9db01536668581dc2836101263b"} Dec 04 09:43:36 crc kubenswrapper[4776]: I1204 09:43:36.272109 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"581ed2f37d85c36e0c188cc5c78579a290de1ca6721a22483404e408e1188956"} Dec 04 09:43:36 crc kubenswrapper[4776]: I1204 09:43:36.272119 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"72f055ddc4d9d5c9b47972b82549dff0806f48f7660306a948f89429c1b16775"} Dec 04 09:43:37 crc kubenswrapper[4776]: I1204 09:43:37.276537 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a2f742b2f11057542b0e0f9ccc9f5fc08428bf48bb8811ff396c7139bef1f740"} Dec 04 09:43:37 crc kubenswrapper[4776]: I1204 09:43:37.276775 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e59a3c6-f022-4e05-a66d-a763ec43e08c" Dec 04 09:43:37 crc kubenswrapper[4776]: I1204 09:43:37.278204 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="3e59a3c6-f022-4e05-a66d-a763ec43e08c" Dec 04 09:43:37 crc kubenswrapper[4776]: I1204 09:43:37.278208 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:38 crc kubenswrapper[4776]: I1204 09:43:38.570225 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:43:38 crc kubenswrapper[4776]: I1204 09:43:38.574839 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:43:39 crc kubenswrapper[4776]: I1204 09:43:39.287328 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:43:39 crc kubenswrapper[4776]: I1204 09:43:39.477373 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:39 crc kubenswrapper[4776]: I1204 09:43:39.477449 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:39 crc kubenswrapper[4776]: I1204 09:43:39.484044 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:42 crc kubenswrapper[4776]: I1204 09:43:42.285323 4776 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:42 crc kubenswrapper[4776]: I1204 09:43:42.301241 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e59a3c6-f022-4e05-a66d-a763ec43e08c" Dec 04 09:43:42 crc kubenswrapper[4776]: I1204 09:43:42.301275 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e59a3c6-f022-4e05-a66d-a763ec43e08c" Dec 04 
09:43:42 crc kubenswrapper[4776]: I1204 09:43:42.307510 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:43:42 crc kubenswrapper[4776]: I1204 09:43:42.310558 4776 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a2ca3443-19b9-4ffb-9b8e-46b4082d6674" Dec 04 09:43:43 crc kubenswrapper[4776]: I1204 09:43:43.023684 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 09:43:43 crc kubenswrapper[4776]: I1204 09:43:43.366195 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e59a3c6-f022-4e05-a66d-a763ec43e08c" Dec 04 09:43:43 crc kubenswrapper[4776]: I1204 09:43:43.366225 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e59a3c6-f022-4e05-a66d-a763ec43e08c" Dec 04 09:43:43 crc kubenswrapper[4776]: I1204 09:43:43.374030 4776 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a2ca3443-19b9-4ffb-9b8e-46b4082d6674" Dec 04 09:43:45 crc kubenswrapper[4776]: E1204 09:43:45.473035 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 09:43:45 crc kubenswrapper[4776]: E1204 09:43:45.483812 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline 
exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 09:43:45 crc kubenswrapper[4776]: E1204 09:43:45.503209 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 09:43:45 crc kubenswrapper[4776]: I1204 09:43:45.603317 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:45 crc kubenswrapper[4776]: I1204 09:43:45.603404 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:45 crc kubenswrapper[4776]: I1204 09:43:45.604417 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:45 crc kubenswrapper[4776]: I1204 09:43:45.610647 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:45 crc kubenswrapper[4776]: I1204 09:43:45.907940 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:45 crc kubenswrapper[4776]: I1204 09:43:45.908017 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:45 crc kubenswrapper[4776]: I1204 09:43:45.911539 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:45 crc kubenswrapper[4776]: I1204 09:43:45.911709 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:52 crc 
kubenswrapper[4776]: I1204 09:43:52.542912 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 09:43:52 crc kubenswrapper[4776]: I1204 09:43:52.738571 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 09:43:52 crc kubenswrapper[4776]: I1204 09:43:52.896577 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 04 09:43:53 crc kubenswrapper[4776]: I1204 09:43:53.118662 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 04 09:43:53 crc kubenswrapper[4776]: I1204 09:43:53.148958 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 09:43:53 crc kubenswrapper[4776]: I1204 09:43:53.202018 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 09:43:53 crc kubenswrapper[4776]: I1204 09:43:53.246111 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 04 09:43:53 crc kubenswrapper[4776]: I1204 09:43:53.481580 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 04 09:43:53 crc kubenswrapper[4776]: I1204 09:43:53.523235 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 09:43:53 crc kubenswrapper[4776]: I1204 09:43:53.648696 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 04 09:43:53 crc kubenswrapper[4776]: I1204 09:43:53.788453 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 09:43:54 crc kubenswrapper[4776]: I1204 09:43:54.026040 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 09:43:54 crc kubenswrapper[4776]: I1204 09:43:54.079845 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 04 09:43:54 crc kubenswrapper[4776]: I1204 09:43:54.134011 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 09:43:54 crc kubenswrapper[4776]: I1204 09:43:54.151730 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 09:43:54 crc kubenswrapper[4776]: I1204 09:43:54.182306 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 04 09:43:54 crc kubenswrapper[4776]: I1204 09:43:54.208118 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 09:43:54 crc kubenswrapper[4776]: I1204 09:43:54.237895 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 04 09:43:54 crc kubenswrapper[4776]: I1204 09:43:54.367809 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 09:43:54 crc kubenswrapper[4776]: I1204 09:43:54.763272 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 09:43:54 crc kubenswrapper[4776]: I1204 09:43:54.773608 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 04 09:43:54 crc 
kubenswrapper[4776]: I1204 09:43:54.790654 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 04 09:43:54 crc kubenswrapper[4776]: I1204 09:43:54.976363 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.017446 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.030192 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.056390 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.084362 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.145956 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.213929 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.224802 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.242038 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.321151 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.427863 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.554468 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.595856 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.697692 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.883056 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.919072 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.947257 4776 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 09:43:55 crc kubenswrapper[4776]: I1204 09:43:55.974305 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.017338 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.086523 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.227050 4776 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.242753 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.385079 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.446568 4776 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.505386 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.513008 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.584102 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.688680 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.698799 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.752417 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.753906 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" 
Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.762224 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.804614 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 04 09:43:56 crc kubenswrapper[4776]: I1204 09:43:56.876977 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.030000 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.087231 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.151712 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.213287 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.342452 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.357130 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.375622 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 
09:43:57.389869 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.410862 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.415273 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.434718 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.450067 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.452021 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.452280 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.653570 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.670290 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.685081 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 04 09:43:57 crc kubenswrapper[4776]: W1204 09:43:57.725730 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-f763d7b5a2173e52c85d35f0720e8f925c635889380a4c240bab80fe27793e7d WatchSource:0}: Error finding container f763d7b5a2173e52c85d35f0720e8f925c635889380a4c240bab80fe27793e7d: Status 404 returned error can't find the container with id f763d7b5a2173e52c85d35f0720e8f925c635889380a4c240bab80fe27793e7d Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.734795 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.770872 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.781170 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.845070 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.906938 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 09:43:57 crc kubenswrapper[4776]: I1204 09:43:57.907089 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.061787 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.158814 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.175394 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.178330 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.288594 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.362768 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.389799 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.432469 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.459783 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3174cc2e97e1262b1ba932a5fed88216935728d23f9c233e09cf112248859303"} Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.459841 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f763d7b5a2173e52c85d35f0720e8f925c635889380a4c240bab80fe27793e7d"} Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.461393 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.496588 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.721092 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.744885 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.795837 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.835504 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 04 09:43:58 crc kubenswrapper[4776]: I1204 09:43:58.853562 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.034881 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"serving-cert" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.078249 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.186766 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.296842 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.347135 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.451969 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.452072 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.452586 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.452711 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.487695 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.503494 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.519321 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.636737 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 09:43:59 crc kubenswrapper[4776]: W1204 09:43:59.717857 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-4a830301878c6a2d7ff54ceca14ec8d7c89e05302913bd67d930ca3ca85515b0 WatchSource:0}: Error finding container 4a830301878c6a2d7ff54ceca14ec8d7c89e05302913bd67d930ca3ca85515b0: Status 404 returned error can't find the container with id 4a830301878c6a2d7ff54ceca14ec8d7c89e05302913bd67d930ca3ca85515b0 Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.838049 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.877481 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 09:43:59.877517 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 09:43:59 crc kubenswrapper[4776]: I1204 
09:43:59.901819 4776 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.009934 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.063796 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.070938 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.073907 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.082851 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.082851 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.095000 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.196155 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.206846 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.281357 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 04 09:44:00 crc 
kubenswrapper[4776]: I1204 09:44:00.304974 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.313992 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.411687 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.416749 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.438998 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.452432 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.475282 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"04f7ad8cea69f62d7771c4b375b4c5d47b2a09ac473ceae97a20b53a473614c7"} Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.475335 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4a830301878c6a2d7ff54ceca14ec8d7c89e05302913bd67d930ca3ca85515b0"} Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.479418 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2dc87bea429c3b4844ed8a2a86b5813e4557a5f8af049d2476e81705105e7ba8"} Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.479449 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"babcd58e1b0d6881132fc2b66b723e4f98358fa1ec6fa593a02a6c155250001f"} Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.479954 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.505382 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.531062 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.751167 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.800552 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.806754 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.879749 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.893705 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 09:44:00 crc kubenswrapper[4776]: 
I1204 09:44:00.939958 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 09:44:00 crc kubenswrapper[4776]: I1204 09:44:00.956337 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.048171 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.048809 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.053605 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.116081 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.145807 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.217255 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.243734 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.323180 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.443843 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.486605 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.486676 4776 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="04f7ad8cea69f62d7771c4b375b4c5d47b2a09ac473ceae97a20b53a473614c7" exitCode=255 Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.486778 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"04f7ad8cea69f62d7771c4b375b4c5d47b2a09ac473ceae97a20b53a473614c7"} Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.487427 4776 scope.go:117] "RemoveContainer" containerID="04f7ad8cea69f62d7771c4b375b4c5d47b2a09ac473ceae97a20b53a473614c7" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.530693 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.581648 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.611628 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.809462 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.833027 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.856336 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.882776 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.961079 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 04 09:44:01 crc kubenswrapper[4776]: I1204 09:44:01.977960 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.041425 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.147301 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.225637 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.342501 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.368217 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.495580 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Dec 04 09:44:02 crc 
kubenswrapper[4776]: I1204 09:44:02.495653 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b7003b43ddc76b204d3f787669eed5b95533c0fcddc44742c656e414057cef8d"} Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.518298 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.525363 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.686199 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.738042 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.748135 4776 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.753362 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.753422 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-7f687b986-fzscj"] Dec 04 09:44:02 crc kubenswrapper[4776]: E1204 09:44:02.753683 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" containerName="installer" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.753704 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" containerName="installer" Dec 04 
09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.753839 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3e726c-2adf-410e-8bd5-bcf6cebb2ac2" containerName="installer" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.753863 4776 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e59a3c6-f022-4e05-a66d-a763ec43e08c" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.753891 4776 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3e59a3c6-f022-4e05-a66d-a763ec43e08c" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.754352 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.760874 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.761270 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.761311 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.761505 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.761723 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.762035 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.762241 4776 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.762360 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.762401 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.762362 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.762484 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.762964 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.763230 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.770070 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.771230 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.781215 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.809540 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.809518306 podStartE2EDuration="20.809518306s" podCreationTimestamp="2025-12-04 09:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:44:02.802089263 +0000 UTC m=+287.668569660" watchObservedRunningTime="2025-12-04 09:44:02.809518306 +0000 UTC m=+287.675998683" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.835472 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.835519 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc2sj\" (UniqueName: \"kubernetes.io/projected/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-kube-api-access-vc2sj\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.835552 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.835572 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-user-template-error\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.835598 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.835622 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.835662 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.835686 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-serving-cert\") 
pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.835705 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-user-template-login\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.835790 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.835827 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-audit-dir\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.835879 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-audit-policies\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 
09:44:02.835897 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.835937 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-session\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.937354 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.937430 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-audit-policies\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.937467 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-session\") pod 
\"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.937499 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.937725 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc2sj\" (UniqueName: \"kubernetes.io/projected/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-kube-api-access-vc2sj\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.937782 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.938404 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-user-template-error\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.938512 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.938576 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.938608 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.938658 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.938683 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-user-template-login\") 
pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.938739 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.938765 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-audit-policies\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.938824 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-audit-dir\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.938789 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-audit-dir\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.939064 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.939312 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.939513 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.943714 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.943970 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " 
pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.944022 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.946931 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.947180 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-session\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.949348 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-user-template-error\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.952002 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.958564 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc2sj\" (UniqueName: \"kubernetes.io/projected/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-kube-api-access-vc2sj\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.959509 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a0d235b-7868-466a-bc3a-4f1a8eaf1299-v4-0-config-user-template-login\") pod \"oauth-openshift-7f687b986-fzscj\" (UID: \"6a0d235b-7868-466a-bc3a-4f1a8eaf1299\") " pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:02 crc kubenswrapper[4776]: I1204 09:44:02.980229 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.084765 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.278658 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.380117 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.460140 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.506160 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7f687b986-fzscj"] Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.506332 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.508711 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 04 09:44:03 crc kubenswrapper[4776]: W1204 09:44:03.512180 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a0d235b_7868_466a_bc3a_4f1a8eaf1299.slice/crio-82a30484b52936dd866b0653939884bcd96f09c6fa41191354510ee7eed493ca WatchSource:0}: Error finding container 82a30484b52936dd866b0653939884bcd96f09c6fa41191354510ee7eed493ca: Status 404 returned error can't find the container with id 82a30484b52936dd866b0653939884bcd96f09c6fa41191354510ee7eed493ca Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.515573 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 
09:44:03.550020 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.628516 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.643693 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.749417 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.775424 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.806671 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.851986 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.944446 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.973587 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 09:44:03 crc kubenswrapper[4776]: I1204 09:44:03.996803 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.003172 4776 reflector.go:368] Caches populated for *v1.CSIDriver from 
k8s.io/client-go/informers/factory.go:160 Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.068628 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.094817 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.119159 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.164485 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.183506 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.230044 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.254850 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.269321 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.305585 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.366901 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.416734 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.508193 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" event={"ID":"6a0d235b-7868-466a-bc3a-4f1a8eaf1299","Type":"ContainerStarted","Data":"f0909d840056e8bf6bc30f2e4d9f4c6b621c842b7b9af624317488ddefadf9fb"} Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.508253 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" event={"ID":"6a0d235b-7868-466a-bc3a-4f1a8eaf1299","Type":"ContainerStarted","Data":"82a30484b52936dd866b0653939884bcd96f09c6fa41191354510ee7eed493ca"} Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.508550 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.514475 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.530470 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7f687b986-fzscj" podStartSLOduration=72.530451728 podStartE2EDuration="1m12.530451728s" podCreationTimestamp="2025-12-04 09:42:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:44:04.530157248 +0000 UTC m=+289.396637625" watchObservedRunningTime="2025-12-04 09:44:04.530451728 +0000 UTC m=+289.396932105" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.583565 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.631515 4776 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.644079 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.751776 4776 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.752136 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d4d30bbc4b844caa94b6632c5448d7dc8ca5591a846936a1537e6479be80daff" gracePeriod=5 Dec 04 09:44:04 crc kubenswrapper[4776]: I1204 09:44:04.938520 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 04 09:44:05 crc kubenswrapper[4776]: I1204 09:44:05.064212 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 04 09:44:05 crc kubenswrapper[4776]: I1204 09:44:05.149212 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 04 09:44:05 crc kubenswrapper[4776]: I1204 09:44:05.173706 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 09:44:05 crc kubenswrapper[4776]: I1204 09:44:05.239164 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 09:44:05 crc kubenswrapper[4776]: I1204 09:44:05.258813 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Dec 04 09:44:05 crc kubenswrapper[4776]: I1204 09:44:05.348795 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 04 09:44:05 crc kubenswrapper[4776]: I1204 09:44:05.440432 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 04 09:44:05 crc kubenswrapper[4776]: I1204 09:44:05.531649 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 04 09:44:05 crc kubenswrapper[4776]: I1204 09:44:05.574240 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 09:44:05 crc kubenswrapper[4776]: I1204 09:44:05.788666 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 09:44:05 crc kubenswrapper[4776]: I1204 09:44:05.903308 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 09:44:05 crc kubenswrapper[4776]: I1204 09:44:05.987816 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 09:44:05 crc kubenswrapper[4776]: I1204 09:44:05.993193 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.241532 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.264598 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.278890 4776 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.288058 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.295266 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.344351 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.348712 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.470129 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.486907 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.487885 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.545404 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.603063 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.615503 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 09:44:06 
crc kubenswrapper[4776]: I1204 09:44:06.797307 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.834101 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.852733 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.873049 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 09:44:06 crc kubenswrapper[4776]: I1204 09:44:06.971931 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 09:44:07 crc kubenswrapper[4776]: I1204 09:44:07.150618 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 04 09:44:07 crc kubenswrapper[4776]: I1204 09:44:07.269294 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 09:44:07 crc kubenswrapper[4776]: I1204 09:44:07.323611 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 09:44:07 crc kubenswrapper[4776]: I1204 09:44:07.375330 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 09:44:07 crc kubenswrapper[4776]: I1204 09:44:07.451110 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 09:44:07 crc kubenswrapper[4776]: I1204 09:44:07.538753 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 09:44:07 crc kubenswrapper[4776]: I1204 09:44:07.689667 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 04 09:44:07 crc kubenswrapper[4776]: I1204 09:44:07.770069 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 04 09:44:07 crc kubenswrapper[4776]: I1204 09:44:07.953943 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 04 09:44:08 crc kubenswrapper[4776]: I1204 09:44:08.286313 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 09:44:08 crc kubenswrapper[4776]: I1204 09:44:08.433967 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 09:44:09 crc kubenswrapper[4776]: I1204 09:44:09.240405 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.323757 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.324112 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.332163 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.332237 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.332251 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.332274 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.332320 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.332360 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.332406 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.332440 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.332516 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.332713 4776 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.332731 4776 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.332742 4776 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.332757 4776 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.339856 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.434244 4776 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.557833 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.557947 4776 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d4d30bbc4b844caa94b6632c5448d7dc8ca5591a846936a1537e6479be80daff" exitCode=137 Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.558013 4776 scope.go:117] "RemoveContainer" containerID="d4d30bbc4b844caa94b6632c5448d7dc8ca5591a846936a1537e6479be80daff" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.558036 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.574831 4776 scope.go:117] "RemoveContainer" containerID="d4d30bbc4b844caa94b6632c5448d7dc8ca5591a846936a1537e6479be80daff" Dec 04 09:44:10 crc kubenswrapper[4776]: E1204 09:44:10.576638 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d30bbc4b844caa94b6632c5448d7dc8ca5591a846936a1537e6479be80daff\": container with ID starting with d4d30bbc4b844caa94b6632c5448d7dc8ca5591a846936a1537e6479be80daff not found: ID does not exist" containerID="d4d30bbc4b844caa94b6632c5448d7dc8ca5591a846936a1537e6479be80daff" Dec 04 09:44:10 crc kubenswrapper[4776]: I1204 09:44:10.576712 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d30bbc4b844caa94b6632c5448d7dc8ca5591a846936a1537e6479be80daff"} err="failed to get container status \"d4d30bbc4b844caa94b6632c5448d7dc8ca5591a846936a1537e6479be80daff\": rpc error: code = NotFound desc = could not find container \"d4d30bbc4b844caa94b6632c5448d7dc8ca5591a846936a1537e6479be80daff\": container with ID starting with d4d30bbc4b844caa94b6632c5448d7dc8ca5591a846936a1537e6479be80daff not found: ID does not exist" Dec 04 09:44:11 crc kubenswrapper[4776]: I1204 09:44:11.471259 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 04 09:44:15 crc kubenswrapper[4776]: I1204 09:44:15.292499 4776 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 04 09:44:16 crc kubenswrapper[4776]: I1204 09:44:16.184454 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 09:44:21 crc kubenswrapper[4776]: I1204 09:44:21.625132 4776 
generic.go:334] "Generic (PLEG): container finished" podID="1e450f38-92b1-4da3-8cb6-353756403eb6" containerID="143f2f0293a3c439f83258d2d51d12e0a8d50d98009d90d4bed6731e554750ce" exitCode=0 Dec 04 09:44:21 crc kubenswrapper[4776]: I1204 09:44:21.625238 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" event={"ID":"1e450f38-92b1-4da3-8cb6-353756403eb6","Type":"ContainerDied","Data":"143f2f0293a3c439f83258d2d51d12e0a8d50d98009d90d4bed6731e554750ce"} Dec 04 09:44:21 crc kubenswrapper[4776]: I1204 09:44:21.625960 4776 scope.go:117] "RemoveContainer" containerID="143f2f0293a3c439f83258d2d51d12e0a8d50d98009d90d4bed6731e554750ce" Dec 04 09:44:22 crc kubenswrapper[4776]: I1204 09:44:22.635148 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" event={"ID":"1e450f38-92b1-4da3-8cb6-353756403eb6","Type":"ContainerStarted","Data":"90bbdb3013f0417cd8348f6623a5fc85c154b460de3b4249280c6275a8a11d4d"} Dec 04 09:44:22 crc kubenswrapper[4776]: I1204 09:44:22.636107 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:44:22 crc kubenswrapper[4776]: I1204 09:44:22.640285 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:44:25 crc kubenswrapper[4776]: I1204 09:44:25.339653 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 04 09:44:29 crc kubenswrapper[4776]: I1204 09:44:29.463864 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.285394 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zkz5f"] Dec 
04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.285899 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" podUID="0c28ea18-b69e-4407-9e39-9a743bc3131c" containerName="controller-manager" containerID="cri-o://cef1c3942b6915a5dd71c8fec98fa8d9f7aec0aad04d971215e933c8c2e17ad7" gracePeriod=30 Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.386062 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md"] Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.386324 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" podUID="0cf14439-b34b-4036-bdb0-a9197b92d3d5" containerName="route-controller-manager" containerID="cri-o://1d7ad689071946a30cdfd8dbae9c8c0a78414277bde4b512403d2bd486d7939a" gracePeriod=30 Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.657791 4776 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zkz5f container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.657844 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" podUID="0c28ea18-b69e-4407-9e39-9a743bc3131c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.692150 4776 generic.go:334] "Generic (PLEG): container finished" podID="0c28ea18-b69e-4407-9e39-9a743bc3131c" 
containerID="cef1c3942b6915a5dd71c8fec98fa8d9f7aec0aad04d971215e933c8c2e17ad7" exitCode=0 Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.692249 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" event={"ID":"0c28ea18-b69e-4407-9e39-9a743bc3131c","Type":"ContainerDied","Data":"cef1c3942b6915a5dd71c8fec98fa8d9f7aec0aad04d971215e933c8c2e17ad7"} Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.693364 4776 generic.go:334] "Generic (PLEG): container finished" podID="0cf14439-b34b-4036-bdb0-a9197b92d3d5" containerID="1d7ad689071946a30cdfd8dbae9c8c0a78414277bde4b512403d2bd486d7939a" exitCode=0 Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.693393 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" event={"ID":"0cf14439-b34b-4036-bdb0-a9197b92d3d5","Type":"ContainerDied","Data":"1d7ad689071946a30cdfd8dbae9c8c0a78414277bde4b512403d2bd486d7939a"} Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.794696 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.832774 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf14439-b34b-4036-bdb0-a9197b92d3d5-config\") pod \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.832938 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz2bv\" (UniqueName: \"kubernetes.io/projected/0cf14439-b34b-4036-bdb0-a9197b92d3d5-kube-api-access-nz2bv\") pod \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.832990 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cf14439-b34b-4036-bdb0-a9197b92d3d5-serving-cert\") pod \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.833021 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cf14439-b34b-4036-bdb0-a9197b92d3d5-client-ca\") pod \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\" (UID: \"0cf14439-b34b-4036-bdb0-a9197b92d3d5\") " Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.834273 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cf14439-b34b-4036-bdb0-a9197b92d3d5-client-ca" (OuterVolumeSpecName: "client-ca") pod "0cf14439-b34b-4036-bdb0-a9197b92d3d5" (UID: "0cf14439-b34b-4036-bdb0-a9197b92d3d5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.834458 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cf14439-b34b-4036-bdb0-a9197b92d3d5-config" (OuterVolumeSpecName: "config") pod "0cf14439-b34b-4036-bdb0-a9197b92d3d5" (UID: "0cf14439-b34b-4036-bdb0-a9197b92d3d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.838253 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf14439-b34b-4036-bdb0-a9197b92d3d5-kube-api-access-nz2bv" (OuterVolumeSpecName: "kube-api-access-nz2bv") pod "0cf14439-b34b-4036-bdb0-a9197b92d3d5" (UID: "0cf14439-b34b-4036-bdb0-a9197b92d3d5"). InnerVolumeSpecName "kube-api-access-nz2bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.838455 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf14439-b34b-4036-bdb0-a9197b92d3d5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0cf14439-b34b-4036-bdb0-a9197b92d3d5" (UID: "0cf14439-b34b-4036-bdb0-a9197b92d3d5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.935261 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz2bv\" (UniqueName: \"kubernetes.io/projected/0cf14439-b34b-4036-bdb0-a9197b92d3d5-kube-api-access-nz2bv\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.935303 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cf14439-b34b-4036-bdb0-a9197b92d3d5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.935316 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cf14439-b34b-4036-bdb0-a9197b92d3d5-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:33 crc kubenswrapper[4776]: I1204 09:44:33.935545 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cf14439-b34b-4036-bdb0-a9197b92d3d5-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.080108 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.138243 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5vt4\" (UniqueName: \"kubernetes.io/projected/0c28ea18-b69e-4407-9e39-9a743bc3131c-kube-api-access-n5vt4\") pod \"0c28ea18-b69e-4407-9e39-9a743bc3131c\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.138292 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-client-ca\") pod \"0c28ea18-b69e-4407-9e39-9a743bc3131c\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.138604 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-config\") pod \"0c28ea18-b69e-4407-9e39-9a743bc3131c\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.138637 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-proxy-ca-bundles\") pod \"0c28ea18-b69e-4407-9e39-9a743bc3131c\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.138664 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c28ea18-b69e-4407-9e39-9a743bc3131c-serving-cert\") pod \"0c28ea18-b69e-4407-9e39-9a743bc3131c\" (UID: \"0c28ea18-b69e-4407-9e39-9a743bc3131c\") " Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.139652 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0c28ea18-b69e-4407-9e39-9a743bc3131c" (UID: "0c28ea18-b69e-4407-9e39-9a743bc3131c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.139686 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-client-ca" (OuterVolumeSpecName: "client-ca") pod "0c28ea18-b69e-4407-9e39-9a743bc3131c" (UID: "0c28ea18-b69e-4407-9e39-9a743bc3131c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.141180 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-config" (OuterVolumeSpecName: "config") pod "0c28ea18-b69e-4407-9e39-9a743bc3131c" (UID: "0c28ea18-b69e-4407-9e39-9a743bc3131c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.142950 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c28ea18-b69e-4407-9e39-9a743bc3131c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0c28ea18-b69e-4407-9e39-9a743bc3131c" (UID: "0c28ea18-b69e-4407-9e39-9a743bc3131c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.143041 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c28ea18-b69e-4407-9e39-9a743bc3131c-kube-api-access-n5vt4" (OuterVolumeSpecName: "kube-api-access-n5vt4") pod "0c28ea18-b69e-4407-9e39-9a743bc3131c" (UID: "0c28ea18-b69e-4407-9e39-9a743bc3131c"). InnerVolumeSpecName "kube-api-access-n5vt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.239950 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5vt4\" (UniqueName: \"kubernetes.io/projected/0c28ea18-b69e-4407-9e39-9a743bc3131c-kube-api-access-n5vt4\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.239986 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.240000 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.240012 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c28ea18-b69e-4407-9e39-9a743bc3131c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.240031 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c28ea18-b69e-4407-9e39-9a743bc3131c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.701536 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.701534 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zkz5f" event={"ID":"0c28ea18-b69e-4407-9e39-9a743bc3131c","Type":"ContainerDied","Data":"9b8c6762fd9196657ff39f7bbd6c0a001215f82475de77244c7151159d383537"} Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.702823 4776 scope.go:117] "RemoveContainer" containerID="cef1c3942b6915a5dd71c8fec98fa8d9f7aec0aad04d971215e933c8c2e17ad7" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.703520 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" event={"ID":"0cf14439-b34b-4036-bdb0-a9197b92d3d5","Type":"ContainerDied","Data":"dc94fc05e5e89e9ea73ee06366e63c9095405a33cbc160809989e9d69ca67694"} Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.703602 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.723154 4776 scope.go:117] "RemoveContainer" containerID="1d7ad689071946a30cdfd8dbae9c8c0a78414277bde4b512403d2bd486d7939a" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.740498 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c667744dd-2vg8p"] Dec 04 09:44:34 crc kubenswrapper[4776]: E1204 09:44:34.740857 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.740884 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 09:44:34 crc kubenswrapper[4776]: E1204 09:44:34.740905 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c28ea18-b69e-4407-9e39-9a743bc3131c" containerName="controller-manager" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.740936 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c28ea18-b69e-4407-9e39-9a743bc3131c" containerName="controller-manager" Dec 04 09:44:34 crc kubenswrapper[4776]: E1204 09:44:34.740951 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf14439-b34b-4036-bdb0-a9197b92d3d5" containerName="route-controller-manager" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.740959 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf14439-b34b-4036-bdb0-a9197b92d3d5" containerName="route-controller-manager" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.741083 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cf14439-b34b-4036-bdb0-a9197b92d3d5" containerName="route-controller-manager" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.741104 4776 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.741123 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c28ea18-b69e-4407-9e39-9a743bc3131c" containerName="controller-manager" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.741659 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.744644 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.744687 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.744864 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.744967 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.744908 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.746510 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.748860 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zkz5f"] Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.752331 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.755693 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zkz5f"] Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.760624 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c667744dd-2vg8p"] Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.765134 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788564f84-l8672"] Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.766190 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.769543 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md"] Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.772442 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.772509 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.772447 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.772684 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.772738 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.772798 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.774011 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g96md"] Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.781061 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788564f84-l8672"] Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.846884 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-client-ca\") pod \"route-controller-manager-7788564f84-l8672\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.846963 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx9lv\" (UniqueName: \"kubernetes.io/projected/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-kube-api-access-jx9lv\") pod \"route-controller-manager-7788564f84-l8672\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.846987 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-client-ca\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " 
pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.847012 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-config\") pod \"route-controller-manager-7788564f84-l8672\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.847027 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-config\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.847044 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-proxy-ca-bundles\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.847074 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08e8a6dc-9de3-4c5d-9481-b69908565c43-serving-cert\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.847142 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kzvpk\" (UniqueName: \"kubernetes.io/projected/08e8a6dc-9de3-4c5d-9481-b69908565c43-kube-api-access-kzvpk\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.847161 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-serving-cert\") pod \"route-controller-manager-7788564f84-l8672\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.948850 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08e8a6dc-9de3-4c5d-9481-b69908565c43-serving-cert\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.949690 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzvpk\" (UniqueName: \"kubernetes.io/projected/08e8a6dc-9de3-4c5d-9481-b69908565c43-kube-api-access-kzvpk\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.950025 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-serving-cert\") pod \"route-controller-manager-7788564f84-l8672\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " 
pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.950252 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-client-ca\") pod \"route-controller-manager-7788564f84-l8672\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.950355 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx9lv\" (UniqueName: \"kubernetes.io/projected/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-kube-api-access-jx9lv\") pod \"route-controller-manager-7788564f84-l8672\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.950451 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-client-ca\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.950554 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-config\") pod \"route-controller-manager-7788564f84-l8672\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.950621 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-config\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.950697 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-proxy-ca-bundles\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.951285 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-client-ca\") pod \"route-controller-manager-7788564f84-l8672\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.952045 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-proxy-ca-bundles\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.952132 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-client-ca\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.952727 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08e8a6dc-9de3-4c5d-9481-b69908565c43-serving-cert\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.952730 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-serving-cert\") pod \"route-controller-manager-7788564f84-l8672\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.953353 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-config\") pod \"route-controller-manager-7788564f84-l8672\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.953492 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-config\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.969439 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx9lv\" (UniqueName: \"kubernetes.io/projected/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-kube-api-access-jx9lv\") pod \"route-controller-manager-7788564f84-l8672\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " 
pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:34 crc kubenswrapper[4776]: I1204 09:44:34.969976 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzvpk\" (UniqueName: \"kubernetes.io/projected/08e8a6dc-9de3-4c5d-9481-b69908565c43-kube-api-access-kzvpk\") pod \"controller-manager-6c667744dd-2vg8p\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.058293 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.084302 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.270207 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c667744dd-2vg8p"] Dec 04 09:44:35 crc kubenswrapper[4776]: W1204 09:44:35.275752 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08e8a6dc_9de3_4c5d_9481_b69908565c43.slice/crio-0432117ce5a78be3f9665894e4da0c974b5497024d148dbe76ea580da4fa57d6 WatchSource:0}: Error finding container 0432117ce5a78be3f9665894e4da0c974b5497024d148dbe76ea580da4fa57d6: Status 404 returned error can't find the container with id 0432117ce5a78be3f9665894e4da0c974b5497024d148dbe76ea580da4fa57d6 Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.334290 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788564f84-l8672"] Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.462648 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="0c28ea18-b69e-4407-9e39-9a743bc3131c" path="/var/lib/kubelet/pods/0c28ea18-b69e-4407-9e39-9a743bc3131c/volumes" Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.463738 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cf14439-b34b-4036-bdb0-a9197b92d3d5" path="/var/lib/kubelet/pods/0cf14439-b34b-4036-bdb0-a9197b92d3d5/volumes" Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.712483 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" event={"ID":"08e8a6dc-9de3-4c5d-9481-b69908565c43","Type":"ContainerStarted","Data":"577d18556a4a4a39890a95a9be93d0e4be4f9b0adbfce009e88ebeb10840a2c5"} Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.712535 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" event={"ID":"08e8a6dc-9de3-4c5d-9481-b69908565c43","Type":"ContainerStarted","Data":"0432117ce5a78be3f9665894e4da0c974b5497024d148dbe76ea580da4fa57d6"} Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.712747 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.714802 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" event={"ID":"ee242362-b6d2-4bb8-82f6-57d8cd4f6497","Type":"ContainerStarted","Data":"1eb5ab6cc85396a1b24853c840b0121199115ba5d80b21350bdd02508e86ea88"} Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.714831 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" event={"ID":"ee242362-b6d2-4bb8-82f6-57d8cd4f6497","Type":"ContainerStarted","Data":"b8cbcb0cd62a42d1fa2515e2df142c48d7daac72280fc15e34ccc0d09aca3ffa"} Dec 04 09:44:35 crc 
kubenswrapper[4776]: I1204 09:44:35.715198 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.720015 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.736560 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" podStartSLOduration=1.736542165 podStartE2EDuration="1.736542165s" podCreationTimestamp="2025-12-04 09:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:44:35.732961663 +0000 UTC m=+320.599442060" watchObservedRunningTime="2025-12-04 09:44:35.736542165 +0000 UTC m=+320.603022542" Dec 04 09:44:35 crc kubenswrapper[4776]: I1204 09:44:35.753268 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" podStartSLOduration=1.753246138 podStartE2EDuration="1.753246138s" podCreationTimestamp="2025-12-04 09:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:44:35.751342659 +0000 UTC m=+320.617823066" watchObservedRunningTime="2025-12-04 09:44:35.753246138 +0000 UTC m=+320.619726515" Dec 04 09:44:36 crc kubenswrapper[4776]: I1204 09:44:36.115707 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.348327 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-6c667744dd-2vg8p"] Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.348879 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" podUID="08e8a6dc-9de3-4c5d-9481-b69908565c43" containerName="controller-manager" containerID="cri-o://577d18556a4a4a39890a95a9be93d0e4be4f9b0adbfce009e88ebeb10840a2c5" gracePeriod=30 Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.363340 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788564f84-l8672"] Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.363537 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" podUID="ee242362-b6d2-4bb8-82f6-57d8cd4f6497" containerName="route-controller-manager" containerID="cri-o://1eb5ab6cc85396a1b24853c840b0121199115ba5d80b21350bdd02508e86ea88" gracePeriod=30 Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.756184 4776 generic.go:334] "Generic (PLEG): container finished" podID="08e8a6dc-9de3-4c5d-9481-b69908565c43" containerID="577d18556a4a4a39890a95a9be93d0e4be4f9b0adbfce009e88ebeb10840a2c5" exitCode=0 Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.756291 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" event={"ID":"08e8a6dc-9de3-4c5d-9481-b69908565c43","Type":"ContainerDied","Data":"577d18556a4a4a39890a95a9be93d0e4be4f9b0adbfce009e88ebeb10840a2c5"} Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.759192 4776 generic.go:334] "Generic (PLEG): container finished" podID="ee242362-b6d2-4bb8-82f6-57d8cd4f6497" containerID="1eb5ab6cc85396a1b24853c840b0121199115ba5d80b21350bdd02508e86ea88" exitCode=0 Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.759242 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" event={"ID":"ee242362-b6d2-4bb8-82f6-57d8cd4f6497","Type":"ContainerDied","Data":"1eb5ab6cc85396a1b24853c840b0121199115ba5d80b21350bdd02508e86ea88"} Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.905319 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.910143 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.950164 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08e8a6dc-9de3-4c5d-9481-b69908565c43-serving-cert\") pod \"08e8a6dc-9de3-4c5d-9481-b69908565c43\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.950214 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-config\") pod \"08e8a6dc-9de3-4c5d-9481-b69908565c43\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.950273 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx9lv\" (UniqueName: \"kubernetes.io/projected/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-kube-api-access-jx9lv\") pod \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.950289 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-client-ca\") pod \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.950307 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-serving-cert\") pod \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.950324 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-client-ca\") pod \"08e8a6dc-9de3-4c5d-9481-b69908565c43\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.950339 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-config\") pod \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\" (UID: \"ee242362-b6d2-4bb8-82f6-57d8cd4f6497\") " Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.950354 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-proxy-ca-bundles\") pod \"08e8a6dc-9de3-4c5d-9481-b69908565c43\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.950396 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzvpk\" (UniqueName: \"kubernetes.io/projected/08e8a6dc-9de3-4c5d-9481-b69908565c43-kube-api-access-kzvpk\") pod \"08e8a6dc-9de3-4c5d-9481-b69908565c43\" (UID: \"08e8a6dc-9de3-4c5d-9481-b69908565c43\") " Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.951610 4776 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-client-ca" (OuterVolumeSpecName: "client-ca") pod "ee242362-b6d2-4bb8-82f6-57d8cd4f6497" (UID: "ee242362-b6d2-4bb8-82f6-57d8cd4f6497"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.951685 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-config" (OuterVolumeSpecName: "config") pod "ee242362-b6d2-4bb8-82f6-57d8cd4f6497" (UID: "ee242362-b6d2-4bb8-82f6-57d8cd4f6497"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.952442 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "08e8a6dc-9de3-4c5d-9481-b69908565c43" (UID: "08e8a6dc-9de3-4c5d-9481-b69908565c43"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.952622 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-client-ca" (OuterVolumeSpecName: "client-ca") pod "08e8a6dc-9de3-4c5d-9481-b69908565c43" (UID: "08e8a6dc-9de3-4c5d-9481-b69908565c43"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.952657 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-config" (OuterVolumeSpecName: "config") pod "08e8a6dc-9de3-4c5d-9481-b69908565c43" (UID: "08e8a6dc-9de3-4c5d-9481-b69908565c43"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.957467 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ee242362-b6d2-4bb8-82f6-57d8cd4f6497" (UID: "ee242362-b6d2-4bb8-82f6-57d8cd4f6497"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.961955 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-kube-api-access-jx9lv" (OuterVolumeSpecName: "kube-api-access-jx9lv") pod "ee242362-b6d2-4bb8-82f6-57d8cd4f6497" (UID: "ee242362-b6d2-4bb8-82f6-57d8cd4f6497"). InnerVolumeSpecName "kube-api-access-jx9lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.962506 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08e8a6dc-9de3-4c5d-9481-b69908565c43-kube-api-access-kzvpk" (OuterVolumeSpecName: "kube-api-access-kzvpk") pod "08e8a6dc-9de3-4c5d-9481-b69908565c43" (UID: "08e8a6dc-9de3-4c5d-9481-b69908565c43"). InnerVolumeSpecName "kube-api-access-kzvpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:44:41 crc kubenswrapper[4776]: I1204 09:44:41.962592 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e8a6dc-9de3-4c5d-9481-b69908565c43-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "08e8a6dc-9de3-4c5d-9481-b69908565c43" (UID: "08e8a6dc-9de3-4c5d-9481-b69908565c43"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.051270 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzvpk\" (UniqueName: \"kubernetes.io/projected/08e8a6dc-9de3-4c5d-9481-b69908565c43-kube-api-access-kzvpk\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.051320 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08e8a6dc-9de3-4c5d-9481-b69908565c43-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.051341 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.051353 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx9lv\" (UniqueName: \"kubernetes.io/projected/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-kube-api-access-jx9lv\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.051365 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.051374 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.051384 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.051398 4776 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee242362-b6d2-4bb8-82f6-57d8cd4f6497-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.051411 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08e8a6dc-9de3-4c5d-9481-b69908565c43-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.766761 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" event={"ID":"08e8a6dc-9de3-4c5d-9481-b69908565c43","Type":"ContainerDied","Data":"0432117ce5a78be3f9665894e4da0c974b5497024d148dbe76ea580da4fa57d6"} Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.766798 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c667744dd-2vg8p" Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.766826 4776 scope.go:117] "RemoveContainer" containerID="577d18556a4a4a39890a95a9be93d0e4be4f9b0adbfce009e88ebeb10840a2c5" Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.770276 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" event={"ID":"ee242362-b6d2-4bb8-82f6-57d8cd4f6497","Type":"ContainerDied","Data":"b8cbcb0cd62a42d1fa2515e2df142c48d7daac72280fc15e34ccc0d09aca3ffa"} Dec 04 09:44:42 crc kubenswrapper[4776]: I1204 09:44:42.770337 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-l8672" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.653016 4776 scope.go:117] "RemoveContainer" containerID="1eb5ab6cc85396a1b24853c840b0121199115ba5d80b21350bdd02508e86ea88" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.685834 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.701093 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-674567bd9b-4hkds"] Dec 04 09:44:43 crc kubenswrapper[4776]: E1204 09:44:43.701867 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e8a6dc-9de3-4c5d-9481-b69908565c43" containerName="controller-manager" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.702665 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e8a6dc-9de3-4c5d-9481-b69908565c43" containerName="controller-manager" Dec 04 09:44:43 crc kubenswrapper[4776]: E1204 09:44:43.702693 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee242362-b6d2-4bb8-82f6-57d8cd4f6497" containerName="route-controller-manager" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.702702 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee242362-b6d2-4bb8-82f6-57d8cd4f6497" containerName="route-controller-manager" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.702879 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e8a6dc-9de3-4c5d-9481-b69908565c43" containerName="controller-manager" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.702938 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee242362-b6d2-4bb8-82f6-57d8cd4f6497" containerName="route-controller-manager" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.703840 4776 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.709427 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.709659 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.709748 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.709879 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9"] Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.710948 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.711849 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.711890 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.714048 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.716039 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-674567bd9b-4hkds"] Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.716875 4776 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.717227 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.718181 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.718373 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.718429 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.718468 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.719827 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9"] Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.723724 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.724584 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c667744dd-2vg8p"] Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.728751 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c667744dd-2vg8p"] Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.733675 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788564f84-l8672"] Dec 04 
09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.736991 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788564f84-l8672"] Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.884992 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-serving-cert\") pod \"route-controller-manager-5c568499db-cdvg9\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.885049 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-client-ca\") pod \"route-controller-manager-5c568499db-cdvg9\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.885213 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-client-ca\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.885333 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-config\") pod \"route-controller-manager-5c568499db-cdvg9\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:43 crc 
kubenswrapper[4776]: I1204 09:44:43.885423 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-proxy-ca-bundles\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.885476 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhq2g\" (UniqueName: \"kubernetes.io/projected/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-kube-api-access-bhq2g\") pod \"route-controller-manager-5c568499db-cdvg9\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.885620 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr9sz\" (UniqueName: \"kubernetes.io/projected/c042ac06-d4ba-45e8-830d-a3d13087097f-kube-api-access-hr9sz\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.885685 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c042ac06-d4ba-45e8-830d-a3d13087097f-serving-cert\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.885832 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-config\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.987873 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-proxy-ca-bundles\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.987940 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhq2g\" (UniqueName: \"kubernetes.io/projected/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-kube-api-access-bhq2g\") pod \"route-controller-manager-5c568499db-cdvg9\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.987970 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr9sz\" (UniqueName: \"kubernetes.io/projected/c042ac06-d4ba-45e8-830d-a3d13087097f-kube-api-access-hr9sz\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.987993 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c042ac06-d4ba-45e8-830d-a3d13087097f-serving-cert\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc 
kubenswrapper[4776]: I1204 09:44:43.988036 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-config\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.988085 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-serving-cert\") pod \"route-controller-manager-5c568499db-cdvg9\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.988114 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-client-ca\") pod \"route-controller-manager-5c568499db-cdvg9\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.988141 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-client-ca\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.988169 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-config\") pod \"route-controller-manager-5c568499db-cdvg9\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") 
" pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.989158 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-client-ca\") pod \"route-controller-manager-5c568499db-cdvg9\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.989254 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-proxy-ca-bundles\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.989563 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-config\") pod \"route-controller-manager-5c568499db-cdvg9\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.990000 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-client-ca\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.990281 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-config\") pod 
\"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.992978 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-serving-cert\") pod \"route-controller-manager-5c568499db-cdvg9\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:43 crc kubenswrapper[4776]: I1204 09:44:43.995055 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c042ac06-d4ba-45e8-830d-a3d13087097f-serving-cert\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:44 crc kubenswrapper[4776]: I1204 09:44:44.007176 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr9sz\" (UniqueName: \"kubernetes.io/projected/c042ac06-d4ba-45e8-830d-a3d13087097f-kube-api-access-hr9sz\") pod \"controller-manager-674567bd9b-4hkds\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:44 crc kubenswrapper[4776]: I1204 09:44:44.007490 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhq2g\" (UniqueName: \"kubernetes.io/projected/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-kube-api-access-bhq2g\") pod \"route-controller-manager-5c568499db-cdvg9\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:44 crc kubenswrapper[4776]: I1204 09:44:44.026355 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:44 crc kubenswrapper[4776]: I1204 09:44:44.033786 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:44 crc kubenswrapper[4776]: I1204 09:44:44.259138 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-674567bd9b-4hkds"] Dec 04 09:44:44 crc kubenswrapper[4776]: I1204 09:44:44.537395 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9"] Dec 04 09:44:44 crc kubenswrapper[4776]: W1204 09:44:44.542191 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f1acd9_24cf_4823_aaf5_c2b4f43bc8cb.slice/crio-da1d7dca769f5d789c70d15a9c70d1d84e19c67df7b2f4342b0a68fc49dd70b5 WatchSource:0}: Error finding container da1d7dca769f5d789c70d15a9c70d1d84e19c67df7b2f4342b0a68fc49dd70b5: Status 404 returned error can't find the container with id da1d7dca769f5d789c70d15a9c70d1d84e19c67df7b2f4342b0a68fc49dd70b5 Dec 04 09:44:44 crc kubenswrapper[4776]: I1204 09:44:44.792373 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" event={"ID":"c042ac06-d4ba-45e8-830d-a3d13087097f","Type":"ContainerStarted","Data":"11c7121ce6a4b6ea3ecdb61fe60e9ba0a64f2f3d9ccb7620b4dfbde0734fc33a"} Dec 04 09:44:44 crc kubenswrapper[4776]: I1204 09:44:44.793869 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" event={"ID":"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb","Type":"ContainerStarted","Data":"da1d7dca769f5d789c70d15a9c70d1d84e19c67df7b2f4342b0a68fc49dd70b5"} Dec 04 09:44:45 crc kubenswrapper[4776]: I1204 
09:44:45.459205 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08e8a6dc-9de3-4c5d-9481-b69908565c43" path="/var/lib/kubelet/pods/08e8a6dc-9de3-4c5d-9481-b69908565c43/volumes" Dec 04 09:44:45 crc kubenswrapper[4776]: I1204 09:44:45.460062 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee242362-b6d2-4bb8-82f6-57d8cd4f6497" path="/var/lib/kubelet/pods/ee242362-b6d2-4bb8-82f6-57d8cd4f6497/volumes" Dec 04 09:44:45 crc kubenswrapper[4776]: I1204 09:44:45.800477 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" event={"ID":"c042ac06-d4ba-45e8-830d-a3d13087097f","Type":"ContainerStarted","Data":"97618b7c67f6104bb9cfca2081e4d229e4351fa229b35830676b9eb2a9872e1b"} Dec 04 09:44:45 crc kubenswrapper[4776]: I1204 09:44:45.801062 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:45 crc kubenswrapper[4776]: I1204 09:44:45.802755 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" event={"ID":"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb","Type":"ContainerStarted","Data":"3b333217dd3f883f83b1766d06bfe68a641f0b5f96a67cb4f4c8e3fc06deadf4"} Dec 04 09:44:45 crc kubenswrapper[4776]: I1204 09:44:45.803241 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:44:45 crc kubenswrapper[4776]: I1204 09:44:45.805587 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:44:45 crc kubenswrapper[4776]: I1204 09:44:45.808803 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 
09:44:45 crc kubenswrapper[4776]: I1204 09:44:45.822027 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" podStartSLOduration=4.82200867 podStartE2EDuration="4.82200867s" podCreationTimestamp="2025-12-04 09:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:44:45.820355758 +0000 UTC m=+330.686836125" watchObservedRunningTime="2025-12-04 09:44:45.82200867 +0000 UTC m=+330.688489047" Dec 04 09:44:45 crc kubenswrapper[4776]: I1204 09:44:45.841848 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" podStartSLOduration=4.841828781 podStartE2EDuration="4.841828781s" podCreationTimestamp="2025-12-04 09:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:44:45.837691091 +0000 UTC m=+330.704171478" watchObservedRunningTime="2025-12-04 09:44:45.841828781 +0000 UTC m=+330.708309158" Dec 04 09:44:49 crc kubenswrapper[4776]: I1204 09:44:49.379870 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:44:49 crc kubenswrapper[4776]: I1204 09:44:49.380205 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.169985 
4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt"] Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.171364 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.174334 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.174442 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.180575 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt"] Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.203555 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cf991e7-ff27-441a-b83f-a70a66455185-secret-volume\") pod \"collect-profiles-29414025-jmfgt\" (UID: \"3cf991e7-ff27-441a-b83f-a70a66455185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.203625 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cf991e7-ff27-441a-b83f-a70a66455185-config-volume\") pod \"collect-profiles-29414025-jmfgt\" (UID: \"3cf991e7-ff27-441a-b83f-a70a66455185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.203669 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s9vmp\" (UniqueName: \"kubernetes.io/projected/3cf991e7-ff27-441a-b83f-a70a66455185-kube-api-access-s9vmp\") pod \"collect-profiles-29414025-jmfgt\" (UID: \"3cf991e7-ff27-441a-b83f-a70a66455185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.305014 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cf991e7-ff27-441a-b83f-a70a66455185-secret-volume\") pod \"collect-profiles-29414025-jmfgt\" (UID: \"3cf991e7-ff27-441a-b83f-a70a66455185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.305103 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cf991e7-ff27-441a-b83f-a70a66455185-config-volume\") pod \"collect-profiles-29414025-jmfgt\" (UID: \"3cf991e7-ff27-441a-b83f-a70a66455185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.305150 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9vmp\" (UniqueName: \"kubernetes.io/projected/3cf991e7-ff27-441a-b83f-a70a66455185-kube-api-access-s9vmp\") pod \"collect-profiles-29414025-jmfgt\" (UID: \"3cf991e7-ff27-441a-b83f-a70a66455185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.306192 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cf991e7-ff27-441a-b83f-a70a66455185-config-volume\") pod \"collect-profiles-29414025-jmfgt\" (UID: \"3cf991e7-ff27-441a-b83f-a70a66455185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" Dec 04 09:45:00 crc 
kubenswrapper[4776]: I1204 09:45:00.316448 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cf991e7-ff27-441a-b83f-a70a66455185-secret-volume\") pod \"collect-profiles-29414025-jmfgt\" (UID: \"3cf991e7-ff27-441a-b83f-a70a66455185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.324759 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9vmp\" (UniqueName: \"kubernetes.io/projected/3cf991e7-ff27-441a-b83f-a70a66455185-kube-api-access-s9vmp\") pod \"collect-profiles-29414025-jmfgt\" (UID: \"3cf991e7-ff27-441a-b83f-a70a66455185\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.493114 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" Dec 04 09:45:00 crc kubenswrapper[4776]: I1204 09:45:00.888581 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt"] Dec 04 09:45:00 crc kubenswrapper[4776]: W1204 09:45:00.892693 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cf991e7_ff27_441a_b83f_a70a66455185.slice/crio-8a21b7c05c6da55645333628f01a282561f6bd123b39ed4204e55dd5c8dcf959 WatchSource:0}: Error finding container 8a21b7c05c6da55645333628f01a282561f6bd123b39ed4204e55dd5c8dcf959: Status 404 returned error can't find the container with id 8a21b7c05c6da55645333628f01a282561f6bd123b39ed4204e55dd5c8dcf959 Dec 04 09:45:01 crc kubenswrapper[4776]: I1204 09:45:01.896432 4776 generic.go:334] "Generic (PLEG): container finished" podID="3cf991e7-ff27-441a-b83f-a70a66455185" 
containerID="376f5b5e11af2c4defbeec654c99370acc674280766a7cdd9957440bad5b85c2" exitCode=0 Dec 04 09:45:01 crc kubenswrapper[4776]: I1204 09:45:01.896682 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" event={"ID":"3cf991e7-ff27-441a-b83f-a70a66455185","Type":"ContainerDied","Data":"376f5b5e11af2c4defbeec654c99370acc674280766a7cdd9957440bad5b85c2"} Dec 04 09:45:01 crc kubenswrapper[4776]: I1204 09:45:01.896763 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" event={"ID":"3cf991e7-ff27-441a-b83f-a70a66455185","Type":"ContainerStarted","Data":"8a21b7c05c6da55645333628f01a282561f6bd123b39ed4204e55dd5c8dcf959"} Dec 04 09:45:03 crc kubenswrapper[4776]: I1204 09:45:03.167452 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" Dec 04 09:45:03 crc kubenswrapper[4776]: I1204 09:45:03.341401 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cf991e7-ff27-441a-b83f-a70a66455185-config-volume\") pod \"3cf991e7-ff27-441a-b83f-a70a66455185\" (UID: \"3cf991e7-ff27-441a-b83f-a70a66455185\") " Dec 04 09:45:03 crc kubenswrapper[4776]: I1204 09:45:03.341570 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cf991e7-ff27-441a-b83f-a70a66455185-secret-volume\") pod \"3cf991e7-ff27-441a-b83f-a70a66455185\" (UID: \"3cf991e7-ff27-441a-b83f-a70a66455185\") " Dec 04 09:45:03 crc kubenswrapper[4776]: I1204 09:45:03.341627 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9vmp\" (UniqueName: \"kubernetes.io/projected/3cf991e7-ff27-441a-b83f-a70a66455185-kube-api-access-s9vmp\") pod 
\"3cf991e7-ff27-441a-b83f-a70a66455185\" (UID: \"3cf991e7-ff27-441a-b83f-a70a66455185\") " Dec 04 09:45:03 crc kubenswrapper[4776]: I1204 09:45:03.342245 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cf991e7-ff27-441a-b83f-a70a66455185-config-volume" (OuterVolumeSpecName: "config-volume") pod "3cf991e7-ff27-441a-b83f-a70a66455185" (UID: "3cf991e7-ff27-441a-b83f-a70a66455185"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:45:03 crc kubenswrapper[4776]: I1204 09:45:03.346252 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf991e7-ff27-441a-b83f-a70a66455185-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3cf991e7-ff27-441a-b83f-a70a66455185" (UID: "3cf991e7-ff27-441a-b83f-a70a66455185"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:45:03 crc kubenswrapper[4776]: I1204 09:45:03.346325 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cf991e7-ff27-441a-b83f-a70a66455185-kube-api-access-s9vmp" (OuterVolumeSpecName: "kube-api-access-s9vmp") pod "3cf991e7-ff27-441a-b83f-a70a66455185" (UID: "3cf991e7-ff27-441a-b83f-a70a66455185"). InnerVolumeSpecName "kube-api-access-s9vmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:45:03 crc kubenswrapper[4776]: I1204 09:45:03.443548 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cf991e7-ff27-441a-b83f-a70a66455185-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:03 crc kubenswrapper[4776]: I1204 09:45:03.443625 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9vmp\" (UniqueName: \"kubernetes.io/projected/3cf991e7-ff27-441a-b83f-a70a66455185-kube-api-access-s9vmp\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:03 crc kubenswrapper[4776]: I1204 09:45:03.443643 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cf991e7-ff27-441a-b83f-a70a66455185-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:03 crc kubenswrapper[4776]: I1204 09:45:03.908316 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" event={"ID":"3cf991e7-ff27-441a-b83f-a70a66455185","Type":"ContainerDied","Data":"8a21b7c05c6da55645333628f01a282561f6bd123b39ed4204e55dd5c8dcf959"} Dec 04 09:45:03 crc kubenswrapper[4776]: I1204 09:45:03.908368 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a21b7c05c6da55645333628f01a282561f6bd123b39ed4204e55dd5c8dcf959" Dec 04 09:45:03 crc kubenswrapper[4776]: I1204 09:45:03.908374 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt" Dec 04 09:45:13 crc kubenswrapper[4776]: I1204 09:45:13.278062 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-674567bd9b-4hkds"] Dec 04 09:45:13 crc kubenswrapper[4776]: I1204 09:45:13.279793 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" podUID="c042ac06-d4ba-45e8-830d-a3d13087097f" containerName="controller-manager" containerID="cri-o://97618b7c67f6104bb9cfca2081e4d229e4351fa229b35830676b9eb2a9872e1b" gracePeriod=30 Dec 04 09:45:13 crc kubenswrapper[4776]: I1204 09:45:13.961711 4776 generic.go:334] "Generic (PLEG): container finished" podID="c042ac06-d4ba-45e8-830d-a3d13087097f" containerID="97618b7c67f6104bb9cfca2081e4d229e4351fa229b35830676b9eb2a9872e1b" exitCode=0 Dec 04 09:45:13 crc kubenswrapper[4776]: I1204 09:45:13.961832 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" event={"ID":"c042ac06-d4ba-45e8-830d-a3d13087097f","Type":"ContainerDied","Data":"97618b7c67f6104bb9cfca2081e4d229e4351fa229b35830676b9eb2a9872e1b"} Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.330518 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.357531 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c667744dd-2qtlf"] Dec 04 09:45:14 crc kubenswrapper[4776]: E1204 09:45:14.357758 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c042ac06-d4ba-45e8-830d-a3d13087097f" containerName="controller-manager" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.357771 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c042ac06-d4ba-45e8-830d-a3d13087097f" containerName="controller-manager" Dec 04 09:45:14 crc kubenswrapper[4776]: E1204 09:45:14.357794 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf991e7-ff27-441a-b83f-a70a66455185" containerName="collect-profiles" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.357800 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf991e7-ff27-441a-b83f-a70a66455185" containerName="collect-profiles" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.357888 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cf991e7-ff27-441a-b83f-a70a66455185" containerName="collect-profiles" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.357906 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c042ac06-d4ba-45e8-830d-a3d13087097f" containerName="controller-manager" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.358320 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.383051 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c667744dd-2qtlf"] Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.401467 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-client-ca\") pod \"c042ac06-d4ba-45e8-830d-a3d13087097f\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.401506 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-proxy-ca-bundles\") pod \"c042ac06-d4ba-45e8-830d-a3d13087097f\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.401616 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-config\") pod \"c042ac06-d4ba-45e8-830d-a3d13087097f\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.401649 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr9sz\" (UniqueName: \"kubernetes.io/projected/c042ac06-d4ba-45e8-830d-a3d13087097f-kube-api-access-hr9sz\") pod \"c042ac06-d4ba-45e8-830d-a3d13087097f\" (UID: \"c042ac06-d4ba-45e8-830d-a3d13087097f\") " Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.401667 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c042ac06-d4ba-45e8-830d-a3d13087097f-serving-cert\") pod \"c042ac06-d4ba-45e8-830d-a3d13087097f\" (UID: 
\"c042ac06-d4ba-45e8-830d-a3d13087097f\") " Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.401803 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-serving-cert\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.401831 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j44c9\" (UniqueName: \"kubernetes.io/projected/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-kube-api-access-j44c9\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.401868 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-config\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.401895 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-client-ca\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.401941 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-proxy-ca-bundles\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.402952 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c042ac06-d4ba-45e8-830d-a3d13087097f" (UID: "c042ac06-d4ba-45e8-830d-a3d13087097f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.403078 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-config" (OuterVolumeSpecName: "config") pod "c042ac06-d4ba-45e8-830d-a3d13087097f" (UID: "c042ac06-d4ba-45e8-830d-a3d13087097f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.403246 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-client-ca" (OuterVolumeSpecName: "client-ca") pod "c042ac06-d4ba-45e8-830d-a3d13087097f" (UID: "c042ac06-d4ba-45e8-830d-a3d13087097f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.408286 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c042ac06-d4ba-45e8-830d-a3d13087097f-kube-api-access-hr9sz" (OuterVolumeSpecName: "kube-api-access-hr9sz") pod "c042ac06-d4ba-45e8-830d-a3d13087097f" (UID: "c042ac06-d4ba-45e8-830d-a3d13087097f"). InnerVolumeSpecName "kube-api-access-hr9sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.409098 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c042ac06-d4ba-45e8-830d-a3d13087097f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c042ac06-d4ba-45e8-830d-a3d13087097f" (UID: "c042ac06-d4ba-45e8-830d-a3d13087097f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.502857 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-proxy-ca-bundles\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.502958 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-serving-cert\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.502999 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j44c9\" (UniqueName: \"kubernetes.io/projected/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-kube-api-access-j44c9\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.503047 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-config\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.503076 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-client-ca\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.503133 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.503147 4776 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.503166 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c042ac06-d4ba-45e8-830d-a3d13087097f-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.503177 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr9sz\" (UniqueName: \"kubernetes.io/projected/c042ac06-d4ba-45e8-830d-a3d13087097f-kube-api-access-hr9sz\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.503189 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c042ac06-d4ba-45e8-830d-a3d13087097f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:14 crc 
kubenswrapper[4776]: I1204 09:45:14.504269 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-client-ca\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.505034 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-config\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.506617 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-proxy-ca-bundles\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.507535 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-serving-cert\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.528872 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j44c9\" (UniqueName: \"kubernetes.io/projected/e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e-kube-api-access-j44c9\") pod \"controller-manager-6c667744dd-2qtlf\" (UID: \"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e\") " 
pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.679161 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.879243 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c667744dd-2qtlf"] Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.968675 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" event={"ID":"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e","Type":"ContainerStarted","Data":"32c64ad3cd68c0d3501203894096e371ca8c4509cc085739038e8df3a8a3411b"} Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.970409 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" event={"ID":"c042ac06-d4ba-45e8-830d-a3d13087097f","Type":"ContainerDied","Data":"11c7121ce6a4b6ea3ecdb61fe60e9ba0a64f2f3d9ccb7620b4dfbde0734fc33a"} Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.970451 4776 scope.go:117] "RemoveContainer" containerID="97618b7c67f6104bb9cfca2081e4d229e4351fa229b35830676b9eb2a9872e1b" Dec 04 09:45:14 crc kubenswrapper[4776]: I1204 09:45:14.970468 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" Dec 04 09:45:15 crc kubenswrapper[4776]: I1204 09:45:14.998787 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-674567bd9b-4hkds"] Dec 04 09:45:15 crc kubenswrapper[4776]: I1204 09:45:15.003791 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-674567bd9b-4hkds"] Dec 04 09:45:15 crc kubenswrapper[4776]: I1204 09:45:15.028021 4776 patch_prober.go:28] interesting pod/controller-manager-674567bd9b-4hkds container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:45:15 crc kubenswrapper[4776]: I1204 09:45:15.028109 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-674567bd9b-4hkds" podUID="c042ac06-d4ba-45e8-830d-a3d13087097f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 09:45:15 crc kubenswrapper[4776]: I1204 09:45:15.459077 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c042ac06-d4ba-45e8-830d-a3d13087097f" path="/var/lib/kubelet/pods/c042ac06-d4ba-45e8-830d-a3d13087097f/volumes" Dec 04 09:45:15 crc kubenswrapper[4776]: I1204 09:45:15.979853 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" event={"ID":"e5882ea4-de98-41e2-bf0c-37c0c8ae5c6e","Type":"ContainerStarted","Data":"7c27305909591a2199adaa6de893d759193f8f39e272f665a23b2f3c20bba868"} Dec 04 09:45:15 crc kubenswrapper[4776]: I1204 09:45:15.980444 4776 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:15 crc kubenswrapper[4776]: I1204 09:45:15.984728 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" Dec 04 09:45:15 crc kubenswrapper[4776]: I1204 09:45:15.999619 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c667744dd-2qtlf" podStartSLOduration=2.999594413 podStartE2EDuration="2.999594413s" podCreationTimestamp="2025-12-04 09:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:45:15.994709029 +0000 UTC m=+360.861189406" watchObservedRunningTime="2025-12-04 09:45:15.999594413 +0000 UTC m=+360.866074800" Dec 04 09:45:19 crc kubenswrapper[4776]: I1204 09:45:19.380267 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:45:19 crc kubenswrapper[4776]: I1204 09:45:19.382268 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:45:33 crc kubenswrapper[4776]: I1204 09:45:33.292460 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9"] Dec 04 09:45:33 crc kubenswrapper[4776]: I1204 09:45:33.293266 4776 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" podUID="44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb" containerName="route-controller-manager" containerID="cri-o://3b333217dd3f883f83b1766d06bfe68a641f0b5f96a67cb4f4c8e3fc06deadf4" gracePeriod=30 Dec 04 09:45:33 crc kubenswrapper[4776]: I1204 09:45:33.860557 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.049705 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-config\") pod \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.049853 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-client-ca\") pod \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.049884 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhq2g\" (UniqueName: \"kubernetes.io/projected/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-kube-api-access-bhq2g\") pod \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.050030 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-serving-cert\") pod \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\" (UID: \"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb\") " Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.050906 4776 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-client-ca" (OuterVolumeSpecName: "client-ca") pod "44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb" (UID: "44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.050937 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-config" (OuterVolumeSpecName: "config") pod "44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb" (UID: "44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.055737 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb" (UID: "44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.055844 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-kube-api-access-bhq2g" (OuterVolumeSpecName: "kube-api-access-bhq2g") pod "44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb" (UID: "44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb"). InnerVolumeSpecName "kube-api-access-bhq2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.076971 4776 generic.go:334] "Generic (PLEG): container finished" podID="44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb" containerID="3b333217dd3f883f83b1766d06bfe68a641f0b5f96a67cb4f4c8e3fc06deadf4" exitCode=0 Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.077013 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" event={"ID":"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb","Type":"ContainerDied","Data":"3b333217dd3f883f83b1766d06bfe68a641f0b5f96a67cb4f4c8e3fc06deadf4"} Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.077051 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" event={"ID":"44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb","Type":"ContainerDied","Data":"da1d7dca769f5d789c70d15a9c70d1d84e19c67df7b2f4342b0a68fc49dd70b5"} Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.077075 4776 scope.go:117] "RemoveContainer" containerID="3b333217dd3f883f83b1766d06bfe68a641f0b5f96a67cb4f4c8e3fc06deadf4" Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.077083 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9" Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.105505 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9"] Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.107993 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c568499db-cdvg9"] Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.110437 4776 scope.go:117] "RemoveContainer" containerID="3b333217dd3f883f83b1766d06bfe68a641f0b5f96a67cb4f4c8e3fc06deadf4" Dec 04 09:45:34 crc kubenswrapper[4776]: E1204 09:45:34.110982 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b333217dd3f883f83b1766d06bfe68a641f0b5f96a67cb4f4c8e3fc06deadf4\": container with ID starting with 3b333217dd3f883f83b1766d06bfe68a641f0b5f96a67cb4f4c8e3fc06deadf4 not found: ID does not exist" containerID="3b333217dd3f883f83b1766d06bfe68a641f0b5f96a67cb4f4c8e3fc06deadf4" Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.111020 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b333217dd3f883f83b1766d06bfe68a641f0b5f96a67cb4f4c8e3fc06deadf4"} err="failed to get container status \"3b333217dd3f883f83b1766d06bfe68a641f0b5f96a67cb4f4c8e3fc06deadf4\": rpc error: code = NotFound desc = could not find container \"3b333217dd3f883f83b1766d06bfe68a641f0b5f96a67cb4f4c8e3fc06deadf4\": container with ID starting with 3b333217dd3f883f83b1766d06bfe68a641f0b5f96a67cb4f4c8e3fc06deadf4 not found: ID does not exist" Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.151985 4776 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-client-ca\") on node \"crc\" 
DevicePath \"\"" Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.152018 4776 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.152032 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhq2g\" (UniqueName: \"kubernetes.io/projected/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-kube-api-access-bhq2g\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:34 crc kubenswrapper[4776]: I1204 09:45:34.152045 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.078303 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x"] Dec 04 09:45:35 crc kubenswrapper[4776]: E1204 09:45:35.078584 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb" containerName="route-controller-manager" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.078599 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb" containerName="route-controller-manager" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.078718 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb" containerName="route-controller-manager" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.079277 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.081064 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.081682 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.082394 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.082587 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.086432 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.086963 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.091310 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x"] Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.264598 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98354c03-cda3-4614-9da0-2a8d30095fcd-serving-cert\") pod \"route-controller-manager-7788564f84-8qc5x\" (UID: \"98354c03-cda3-4614-9da0-2a8d30095fcd\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.265043 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qnp6\" (UniqueName: \"kubernetes.io/projected/98354c03-cda3-4614-9da0-2a8d30095fcd-kube-api-access-8qnp6\") pod \"route-controller-manager-7788564f84-8qc5x\" (UID: \"98354c03-cda3-4614-9da0-2a8d30095fcd\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.265096 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98354c03-cda3-4614-9da0-2a8d30095fcd-client-ca\") pod \"route-controller-manager-7788564f84-8qc5x\" (UID: \"98354c03-cda3-4614-9da0-2a8d30095fcd\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.265272 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98354c03-cda3-4614-9da0-2a8d30095fcd-config\") pod \"route-controller-manager-7788564f84-8qc5x\" (UID: \"98354c03-cda3-4614-9da0-2a8d30095fcd\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.366038 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98354c03-cda3-4614-9da0-2a8d30095fcd-serving-cert\") pod \"route-controller-manager-7788564f84-8qc5x\" (UID: \"98354c03-cda3-4614-9da0-2a8d30095fcd\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.366090 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qnp6\" (UniqueName: \"kubernetes.io/projected/98354c03-cda3-4614-9da0-2a8d30095fcd-kube-api-access-8qnp6\") pod 
\"route-controller-manager-7788564f84-8qc5x\" (UID: \"98354c03-cda3-4614-9da0-2a8d30095fcd\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.366118 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98354c03-cda3-4614-9da0-2a8d30095fcd-client-ca\") pod \"route-controller-manager-7788564f84-8qc5x\" (UID: \"98354c03-cda3-4614-9da0-2a8d30095fcd\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.366156 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98354c03-cda3-4614-9da0-2a8d30095fcd-config\") pod \"route-controller-manager-7788564f84-8qc5x\" (UID: \"98354c03-cda3-4614-9da0-2a8d30095fcd\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.367375 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98354c03-cda3-4614-9da0-2a8d30095fcd-client-ca\") pod \"route-controller-manager-7788564f84-8qc5x\" (UID: \"98354c03-cda3-4614-9da0-2a8d30095fcd\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.367468 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98354c03-cda3-4614-9da0-2a8d30095fcd-config\") pod \"route-controller-manager-7788564f84-8qc5x\" (UID: \"98354c03-cda3-4614-9da0-2a8d30095fcd\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.374799 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98354c03-cda3-4614-9da0-2a8d30095fcd-serving-cert\") pod \"route-controller-manager-7788564f84-8qc5x\" (UID: \"98354c03-cda3-4614-9da0-2a8d30095fcd\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.382979 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qnp6\" (UniqueName: \"kubernetes.io/projected/98354c03-cda3-4614-9da0-2a8d30095fcd-kube-api-access-8qnp6\") pod \"route-controller-manager-7788564f84-8qc5x\" (UID: \"98354c03-cda3-4614-9da0-2a8d30095fcd\") " pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.394803 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.464970 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb" path="/var/lib/kubelet/pods/44f1acd9-24cf-4823-aaf5-c2b4f43bc8cb/volumes" Dec 04 09:45:35 crc kubenswrapper[4776]: I1204 09:45:35.793970 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x"] Dec 04 09:45:36 crc kubenswrapper[4776]: I1204 09:45:36.096107 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" event={"ID":"98354c03-cda3-4614-9da0-2a8d30095fcd","Type":"ContainerStarted","Data":"95d3e4f6a822d0f2da4cd153a5b41871e693b781a32208598ce49fba775fcc73"} Dec 04 09:45:37 crc kubenswrapper[4776]: I1204 09:45:37.102892 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" 
event={"ID":"98354c03-cda3-4614-9da0-2a8d30095fcd","Type":"ContainerStarted","Data":"5677a45606789bcbe20d60280dbb53af795f8caf496be6dc65745945b3c29070"} Dec 04 09:45:37 crc kubenswrapper[4776]: I1204 09:45:37.103183 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:37 crc kubenswrapper[4776]: I1204 09:45:37.109503 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" Dec 04 09:45:37 crc kubenswrapper[4776]: I1204 09:45:37.126612 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7788564f84-8qc5x" podStartSLOduration=4.126582847 podStartE2EDuration="4.126582847s" podCreationTimestamp="2025-12-04 09:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:45:37.121197859 +0000 UTC m=+381.987678266" watchObservedRunningTime="2025-12-04 09:45:37.126582847 +0000 UTC m=+381.993063224" Dec 04 09:45:49 crc kubenswrapper[4776]: I1204 09:45:49.379468 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:45:49 crc kubenswrapper[4776]: I1204 09:45:49.380097 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:45:49 crc kubenswrapper[4776]: I1204 
09:45:49.380161 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:45:49 crc kubenswrapper[4776]: I1204 09:45:49.380748 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52063fc4909fedc01d5493701de9543a427e45e6bf2a8e7be858f79dd4a8bd99"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 09:45:49 crc kubenswrapper[4776]: I1204 09:45:49.380802 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://52063fc4909fedc01d5493701de9543a427e45e6bf2a8e7be858f79dd4a8bd99" gracePeriod=600 Dec 04 09:45:49 crc kubenswrapper[4776]: E1204 09:45:49.470414 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda57f7940_a976_4c85_bcb7_a1c24ba08266.slice/crio-52063fc4909fedc01d5493701de9543a427e45e6bf2a8e7be858f79dd4a8bd99.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda57f7940_a976_4c85_bcb7_a1c24ba08266.slice/crio-conmon-52063fc4909fedc01d5493701de9543a427e45e6bf2a8e7be858f79dd4a8bd99.scope\": RecentStats: unable to find data in memory cache]" Dec 04 09:45:50 crc kubenswrapper[4776]: I1204 09:45:50.174944 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="52063fc4909fedc01d5493701de9543a427e45e6bf2a8e7be858f79dd4a8bd99" exitCode=0 Dec 04 09:45:50 crc kubenswrapper[4776]: I1204 09:45:50.175044 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"52063fc4909fedc01d5493701de9543a427e45e6bf2a8e7be858f79dd4a8bd99"} Dec 04 09:45:50 crc kubenswrapper[4776]: I1204 09:45:50.175233 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"ac3e98a2af945b6af89c1faf1bb20724affb9b8a861a16a105f52cbc2e74a78a"} Dec 04 09:45:50 crc kubenswrapper[4776]: I1204 09:45:50.175258 4776 scope.go:117] "RemoveContainer" containerID="f92d939606c6e048cb6a405d60ef14c409f7007f91afce409d99977948c4a500" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.037419 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bhnrh"] Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.050700 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j2289"] Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.051104 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j2289" podUID="719304d2-2416-40be-b76a-ca884c683161" containerName="registry-server" containerID="cri-o://7684e23bc7cd3d6c1fb96eb424902f659852d4e9dbb8c437674156f61d0254d2" gracePeriod=30 Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.056276 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-smxws"] Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.056615 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" podUID="1e450f38-92b1-4da3-8cb6-353756403eb6" containerName="marketplace-operator" 
containerID="cri-o://90bbdb3013f0417cd8348f6623a5fc85c154b460de3b4249280c6275a8a11d4d" gracePeriod=30 Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.061382 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrzln"] Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.061652 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vrzln" podUID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" containerName="registry-server" containerID="cri-o://79316d230ff1fb6fae0646366771fc060367b6f3c611122ae14424a8e5099834" gracePeriod=30 Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.067482 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m27lb"] Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.067788 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m27lb" podUID="0b861681-465a-4c51-8663-ecd652c7c7b0" containerName="registry-server" containerID="cri-o://56b0ccf9a6ed6f4bff8d89041564e5f74d64b2a22d5ac919e087cdc41100ebbe" gracePeriod=30 Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.075525 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nvzbs"] Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.076427 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.105252 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nvzbs"] Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.211417 4776 generic.go:334] "Generic (PLEG): container finished" podID="719304d2-2416-40be-b76a-ca884c683161" containerID="7684e23bc7cd3d6c1fb96eb424902f659852d4e9dbb8c437674156f61d0254d2" exitCode=0 Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.213044 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2289" event={"ID":"719304d2-2416-40be-b76a-ca884c683161","Type":"ContainerDied","Data":"7684e23bc7cd3d6c1fb96eb424902f659852d4e9dbb8c437674156f61d0254d2"} Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.213435 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nvzbs\" (UID: \"d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.213517 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nvzbs\" (UID: \"d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.213560 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7lg6\" (UniqueName: 
\"kubernetes.io/projected/d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8-kube-api-access-l7lg6\") pod \"marketplace-operator-79b997595-nvzbs\" (UID: \"d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.221904 4776 generic.go:334] "Generic (PLEG): container finished" podID="1e450f38-92b1-4da3-8cb6-353756403eb6" containerID="90bbdb3013f0417cd8348f6623a5fc85c154b460de3b4249280c6275a8a11d4d" exitCode=0 Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.221993 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" event={"ID":"1e450f38-92b1-4da3-8cb6-353756403eb6","Type":"ContainerDied","Data":"90bbdb3013f0417cd8348f6623a5fc85c154b460de3b4249280c6275a8a11d4d"} Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.222042 4776 scope.go:117] "RemoveContainer" containerID="143f2f0293a3c439f83258d2d51d12e0a8d50d98009d90d4bed6731e554750ce" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.235679 4776 generic.go:334] "Generic (PLEG): container finished" podID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" containerID="79316d230ff1fb6fae0646366771fc060367b6f3c611122ae14424a8e5099834" exitCode=0 Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.235796 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrzln" event={"ID":"b1d11071-0dee-4b5f-989e-36b89f0eb26f","Type":"ContainerDied","Data":"79316d230ff1fb6fae0646366771fc060367b6f3c611122ae14424a8e5099834"} Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.245297 4776 generic.go:334] "Generic (PLEG): container finished" podID="0b861681-465a-4c51-8663-ecd652c7c7b0" containerID="56b0ccf9a6ed6f4bff8d89041564e5f74d64b2a22d5ac919e087cdc41100ebbe" exitCode=0 Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.245543 4776 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-bhnrh" podUID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" containerName="registry-server" containerID="cri-o://97f88c8096e0bf87982baa1059c381e8612ddc84d8b18e5e1c5ac3e3408b2265" gracePeriod=30 Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.245653 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m27lb" event={"ID":"0b861681-465a-4c51-8663-ecd652c7c7b0","Type":"ContainerDied","Data":"56b0ccf9a6ed6f4bff8d89041564e5f74d64b2a22d5ac919e087cdc41100ebbe"} Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.314247 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nvzbs\" (UID: \"d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.314325 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7lg6\" (UniqueName: \"kubernetes.io/projected/d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8-kube-api-access-l7lg6\") pod \"marketplace-operator-79b997595-nvzbs\" (UID: \"d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.314386 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nvzbs\" (UID: \"d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.316346 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nvzbs\" (UID: \"d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.326466 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nvzbs\" (UID: \"d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.346843 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7lg6\" (UniqueName: \"kubernetes.io/projected/d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8-kube-api-access-l7lg6\") pod \"marketplace-operator-79b997595-nvzbs\" (UID: \"d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.397329 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.654671 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.704277 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.740081 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j2289" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.741702 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.827372 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx4rb\" (UniqueName: \"kubernetes.io/projected/b1d11071-0dee-4b5f-989e-36b89f0eb26f-kube-api-access-mx4rb\") pod \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\" (UID: \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\") " Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.827711 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/719304d2-2416-40be-b76a-ca884c683161-catalog-content\") pod \"719304d2-2416-40be-b76a-ca884c683161\" (UID: \"719304d2-2416-40be-b76a-ca884c683161\") " Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.827734 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b861681-465a-4c51-8663-ecd652c7c7b0-utilities\") pod \"0b861681-465a-4c51-8663-ecd652c7c7b0\" (UID: \"0b861681-465a-4c51-8663-ecd652c7c7b0\") " Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.827766 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e450f38-92b1-4da3-8cb6-353756403eb6-marketplace-operator-metrics\") pod \"1e450f38-92b1-4da3-8cb6-353756403eb6\" (UID: \"1e450f38-92b1-4da3-8cb6-353756403eb6\") " Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.827818 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85g6g\" (UniqueName: \"kubernetes.io/projected/0b861681-465a-4c51-8663-ecd652c7c7b0-kube-api-access-85g6g\") pod 
\"0b861681-465a-4c51-8663-ecd652c7c7b0\" (UID: \"0b861681-465a-4c51-8663-ecd652c7c7b0\") " Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.827848 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b861681-465a-4c51-8663-ecd652c7c7b0-catalog-content\") pod \"0b861681-465a-4c51-8663-ecd652c7c7b0\" (UID: \"0b861681-465a-4c51-8663-ecd652c7c7b0\") " Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.827876 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnwml\" (UniqueName: \"kubernetes.io/projected/1e450f38-92b1-4da3-8cb6-353756403eb6-kube-api-access-jnwml\") pod \"1e450f38-92b1-4da3-8cb6-353756403eb6\" (UID: \"1e450f38-92b1-4da3-8cb6-353756403eb6\") " Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.827930 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srv4b\" (UniqueName: \"kubernetes.io/projected/719304d2-2416-40be-b76a-ca884c683161-kube-api-access-srv4b\") pod \"719304d2-2416-40be-b76a-ca884c683161\" (UID: \"719304d2-2416-40be-b76a-ca884c683161\") " Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.827954 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/719304d2-2416-40be-b76a-ca884c683161-utilities\") pod \"719304d2-2416-40be-b76a-ca884c683161\" (UID: \"719304d2-2416-40be-b76a-ca884c683161\") " Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.827994 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e450f38-92b1-4da3-8cb6-353756403eb6-marketplace-trusted-ca\") pod \"1e450f38-92b1-4da3-8cb6-353756403eb6\" (UID: \"1e450f38-92b1-4da3-8cb6-353756403eb6\") " Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.828018 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d11071-0dee-4b5f-989e-36b89f0eb26f-utilities\") pod \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\" (UID: \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\") " Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.828056 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d11071-0dee-4b5f-989e-36b89f0eb26f-catalog-content\") pod \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\" (UID: \"b1d11071-0dee-4b5f-989e-36b89f0eb26f\") " Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.833294 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e450f38-92b1-4da3-8cb6-353756403eb6-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "1e450f38-92b1-4da3-8cb6-353756403eb6" (UID: "1e450f38-92b1-4da3-8cb6-353756403eb6"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.836127 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/719304d2-2416-40be-b76a-ca884c683161-utilities" (OuterVolumeSpecName: "utilities") pod "719304d2-2416-40be-b76a-ca884c683161" (UID: "719304d2-2416-40be-b76a-ca884c683161"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.837431 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b861681-465a-4c51-8663-ecd652c7c7b0-utilities" (OuterVolumeSpecName: "utilities") pod "0b861681-465a-4c51-8663-ecd652c7c7b0" (UID: "0b861681-465a-4c51-8663-ecd652c7c7b0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.839263 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d11071-0dee-4b5f-989e-36b89f0eb26f-utilities" (OuterVolumeSpecName: "utilities") pod "b1d11071-0dee-4b5f-989e-36b89f0eb26f" (UID: "b1d11071-0dee-4b5f-989e-36b89f0eb26f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.843859 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719304d2-2416-40be-b76a-ca884c683161-kube-api-access-srv4b" (OuterVolumeSpecName: "kube-api-access-srv4b") pod "719304d2-2416-40be-b76a-ca884c683161" (UID: "719304d2-2416-40be-b76a-ca884c683161"). InnerVolumeSpecName "kube-api-access-srv4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.844191 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e450f38-92b1-4da3-8cb6-353756403eb6-kube-api-access-jnwml" (OuterVolumeSpecName: "kube-api-access-jnwml") pod "1e450f38-92b1-4da3-8cb6-353756403eb6" (UID: "1e450f38-92b1-4da3-8cb6-353756403eb6"). InnerVolumeSpecName "kube-api-access-jnwml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.845424 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b861681-465a-4c51-8663-ecd652c7c7b0-kube-api-access-85g6g" (OuterVolumeSpecName: "kube-api-access-85g6g") pod "0b861681-465a-4c51-8663-ecd652c7c7b0" (UID: "0b861681-465a-4c51-8663-ecd652c7c7b0"). InnerVolumeSpecName "kube-api-access-85g6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.845597 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e450f38-92b1-4da3-8cb6-353756403eb6-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "1e450f38-92b1-4da3-8cb6-353756403eb6" (UID: "1e450f38-92b1-4da3-8cb6-353756403eb6"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.844710 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d11071-0dee-4b5f-989e-36b89f0eb26f-kube-api-access-mx4rb" (OuterVolumeSpecName: "kube-api-access-mx4rb") pod "b1d11071-0dee-4b5f-989e-36b89f0eb26f" (UID: "b1d11071-0dee-4b5f-989e-36b89f0eb26f"). InnerVolumeSpecName "kube-api-access-mx4rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.854577 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d11071-0dee-4b5f-989e-36b89f0eb26f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1d11071-0dee-4b5f-989e-36b89f0eb26f" (UID: "b1d11071-0dee-4b5f-989e-36b89f0eb26f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.903970 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/719304d2-2416-40be-b76a-ca884c683161-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "719304d2-2416-40be-b76a-ca884c683161" (UID: "719304d2-2416-40be-b76a-ca884c683161"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.929083 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85g6g\" (UniqueName: \"kubernetes.io/projected/0b861681-465a-4c51-8663-ecd652c7c7b0-kube-api-access-85g6g\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.929120 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnwml\" (UniqueName: \"kubernetes.io/projected/1e450f38-92b1-4da3-8cb6-353756403eb6-kube-api-access-jnwml\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.929132 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srv4b\" (UniqueName: \"kubernetes.io/projected/719304d2-2416-40be-b76a-ca884c683161-kube-api-access-srv4b\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.929147 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/719304d2-2416-40be-b76a-ca884c683161-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.929163 4776 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e450f38-92b1-4da3-8cb6-353756403eb6-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.929173 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1d11071-0dee-4b5f-989e-36b89f0eb26f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.929219 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1d11071-0dee-4b5f-989e-36b89f0eb26f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:53 crc 
kubenswrapper[4776]: I1204 09:45:53.929232 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx4rb\" (UniqueName: \"kubernetes.io/projected/b1d11071-0dee-4b5f-989e-36b89f0eb26f-kube-api-access-mx4rb\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.929243 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/719304d2-2416-40be-b76a-ca884c683161-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.929254 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b861681-465a-4c51-8663-ecd652c7c7b0-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.929267 4776 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e450f38-92b1-4da3-8cb6-353756403eb6-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.929973 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.978334 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b861681-465a-4c51-8663-ecd652c7c7b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b861681-465a-4c51-8663-ecd652c7c7b0" (UID: "0b861681-465a-4c51-8663-ecd652c7c7b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:45:53 crc kubenswrapper[4776]: I1204 09:45:53.988121 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nvzbs"] Dec 04 09:45:53 crc kubenswrapper[4776]: W1204 09:45:53.993800 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8f6af2c_c9d1_4e7e_b102_cb6fb4c7fcf8.slice/crio-2f389be24c16f8d87b7c487c591ccb2adce5f3756380dfdafd9a8671bae84261 WatchSource:0}: Error finding container 2f389be24c16f8d87b7c487c591ccb2adce5f3756380dfdafd9a8671bae84261: Status 404 returned error can't find the container with id 2f389be24c16f8d87b7c487c591ccb2adce5f3756380dfdafd9a8671bae84261 Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.030183 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-utilities\") pod \"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\" (UID: \"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\") " Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.030284 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cjh2\" (UniqueName: \"kubernetes.io/projected/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-kube-api-access-4cjh2\") pod \"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\" (UID: \"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\") " Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.030353 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-catalog-content\") pod \"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\" (UID: \"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2\") " Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.030602 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/0b861681-465a-4c51-8663-ecd652c7c7b0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.031716 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-utilities" (OuterVolumeSpecName: "utilities") pod "e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" (UID: "e9c47789-6fcb-4f3d-9b38-99643a8fe1a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.035318 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-kube-api-access-4cjh2" (OuterVolumeSpecName: "kube-api-access-4cjh2") pod "e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" (UID: "e9c47789-6fcb-4f3d-9b38-99643a8fe1a2"). InnerVolumeSpecName "kube-api-access-4cjh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.084601 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" (UID: "e9c47789-6fcb-4f3d-9b38-99643a8fe1a2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.131826 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.131876 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.131895 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cjh2\" (UniqueName: \"kubernetes.io/projected/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2-kube-api-access-4cjh2\") on node \"crc\" DevicePath \"\"" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.253669 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrzln" event={"ID":"b1d11071-0dee-4b5f-989e-36b89f0eb26f","Type":"ContainerDied","Data":"76ad2f758eaeab4ad1ece32bdbb03467c7400c6a6bad8d78a1334001789a9ebe"} Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.254061 4776 scope.go:117] "RemoveContainer" containerID="79316d230ff1fb6fae0646366771fc060367b6f3c611122ae14424a8e5099834" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.253729 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrzln" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.257406 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m27lb" event={"ID":"0b861681-465a-4c51-8663-ecd652c7c7b0","Type":"ContainerDied","Data":"829dad834862ac28a0a471ed252b6237fa4fe9a8200ade4479b3fb8c7f2a56fe"} Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.257527 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m27lb" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.261542 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2289" event={"ID":"719304d2-2416-40be-b76a-ca884c683161","Type":"ContainerDied","Data":"f25f50779d373cdc6969f51388b7533403b4cacb0a6350dea29ce72c61173f01"} Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.261698 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j2289" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.263977 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" event={"ID":"d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8","Type":"ContainerStarted","Data":"d8607a3f5a4fe85d3c124f8676780dfe81a2f0cc26f5f2aa727d640697e5db2f"} Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.264026 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" event={"ID":"d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8","Type":"ContainerStarted","Data":"2f389be24c16f8d87b7c487c591ccb2adce5f3756380dfdafd9a8671bae84261"} Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.264233 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.266816 4776 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nvzbs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" start-of-body= Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.266866 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" podUID="d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.268091 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" 
event={"ID":"1e450f38-92b1-4da3-8cb6-353756403eb6","Type":"ContainerDied","Data":"bf7aece4e38f4cae0b218a3d5e17b4a190a3f6596abe80b56ea81c861be8e3a8"} Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.268192 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.272834 4776 scope.go:117] "RemoveContainer" containerID="9737a7bb927ed12b4315e23542e873e33f501e771ad00067da2edd2718ecfe34" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.274121 4776 generic.go:334] "Generic (PLEG): container finished" podID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" containerID="97f88c8096e0bf87982baa1059c381e8612ddc84d8b18e5e1c5ac3e3408b2265" exitCode=0 Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.274161 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhnrh" event={"ID":"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2","Type":"ContainerDied","Data":"97f88c8096e0bf87982baa1059c381e8612ddc84d8b18e5e1c5ac3e3408b2265"} Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.274190 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhnrh" event={"ID":"e9c47789-6fcb-4f3d-9b38-99643a8fe1a2","Type":"ContainerDied","Data":"711fda8b721e0fa0a03189582178e53a5a8b28c791963a704633ba689b7a431e"} Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.274258 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bhnrh" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.294059 4776 scope.go:117] "RemoveContainer" containerID="c61e64ef4fba8d4dc016ca435eaf294bbe559e959d6d04b739145aa8223d76ba" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.298342 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" podStartSLOduration=1.298318244 podStartE2EDuration="1.298318244s" podCreationTimestamp="2025-12-04 09:45:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:45:54.28645615 +0000 UTC m=+399.152936527" watchObservedRunningTime="2025-12-04 09:45:54.298318244 +0000 UTC m=+399.164798621" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.318619 4776 scope.go:117] "RemoveContainer" containerID="56b0ccf9a6ed6f4bff8d89041564e5f74d64b2a22d5ac919e087cdc41100ebbe" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.325279 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j2289"] Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.331303 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j2289"] Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.336438 4776 scope.go:117] "RemoveContainer" containerID="5c963b57f49bb60f960e521d3ab80fdda70adf881ba59a13c427cccae4ee07df" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.351425 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-smxws"] Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.360417 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-smxws"] Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.371688 4776 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m27lb"] Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.376512 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m27lb"] Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.376701 4776 scope.go:117] "RemoveContainer" containerID="30a7f4fdc2694e6b429705e2425419df0c978072508fe674cdf2f2c37bc6e5f5" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.383311 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrzln"] Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.386505 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrzln"] Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.389868 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bhnrh"] Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.392550 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bhnrh"] Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.397084 4776 scope.go:117] "RemoveContainer" containerID="7684e23bc7cd3d6c1fb96eb424902f659852d4e9dbb8c437674156f61d0254d2" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.417819 4776 scope.go:117] "RemoveContainer" containerID="0cc88e1a4fe819a4e16c47e47c63cccc8e674d2e250986cb902126922df77b96" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.441046 4776 scope.go:117] "RemoveContainer" containerID="17fb3685f4c4f718a37524327f1874fe66ffdd545c69b1eceecff349f9b9a12d" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.459346 4776 scope.go:117] "RemoveContainer" containerID="90bbdb3013f0417cd8348f6623a5fc85c154b460de3b4249280c6275a8a11d4d" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.478418 4776 scope.go:117] "RemoveContainer" 
containerID="97f88c8096e0bf87982baa1059c381e8612ddc84d8b18e5e1c5ac3e3408b2265" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.496690 4776 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-smxws container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.496763 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-smxws" podUID="1e450f38-92b1-4da3-8cb6-353756403eb6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.503972 4776 scope.go:117] "RemoveContainer" containerID="9792a006f53b00fb0d095ab32ee3587c1918c06cfdcc1f1960505d7f9af976ce" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.522203 4776 scope.go:117] "RemoveContainer" containerID="b5ae8669b0ba21fd374df377207b8c59963d908e14ec6ccdc876f3ec1c5e9d3d" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.536809 4776 scope.go:117] "RemoveContainer" containerID="97f88c8096e0bf87982baa1059c381e8612ddc84d8b18e5e1c5ac3e3408b2265" Dec 04 09:45:54 crc kubenswrapper[4776]: E1204 09:45:54.537314 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97f88c8096e0bf87982baa1059c381e8612ddc84d8b18e5e1c5ac3e3408b2265\": container with ID starting with 97f88c8096e0bf87982baa1059c381e8612ddc84d8b18e5e1c5ac3e3408b2265 not found: ID does not exist" containerID="97f88c8096e0bf87982baa1059c381e8612ddc84d8b18e5e1c5ac3e3408b2265" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.537357 4776 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"97f88c8096e0bf87982baa1059c381e8612ddc84d8b18e5e1c5ac3e3408b2265"} err="failed to get container status \"97f88c8096e0bf87982baa1059c381e8612ddc84d8b18e5e1c5ac3e3408b2265\": rpc error: code = NotFound desc = could not find container \"97f88c8096e0bf87982baa1059c381e8612ddc84d8b18e5e1c5ac3e3408b2265\": container with ID starting with 97f88c8096e0bf87982baa1059c381e8612ddc84d8b18e5e1c5ac3e3408b2265 not found: ID does not exist" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.537387 4776 scope.go:117] "RemoveContainer" containerID="9792a006f53b00fb0d095ab32ee3587c1918c06cfdcc1f1960505d7f9af976ce" Dec 04 09:45:54 crc kubenswrapper[4776]: E1204 09:45:54.537818 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9792a006f53b00fb0d095ab32ee3587c1918c06cfdcc1f1960505d7f9af976ce\": container with ID starting with 9792a006f53b00fb0d095ab32ee3587c1918c06cfdcc1f1960505d7f9af976ce not found: ID does not exist" containerID="9792a006f53b00fb0d095ab32ee3587c1918c06cfdcc1f1960505d7f9af976ce" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.537942 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9792a006f53b00fb0d095ab32ee3587c1918c06cfdcc1f1960505d7f9af976ce"} err="failed to get container status \"9792a006f53b00fb0d095ab32ee3587c1918c06cfdcc1f1960505d7f9af976ce\": rpc error: code = NotFound desc = could not find container \"9792a006f53b00fb0d095ab32ee3587c1918c06cfdcc1f1960505d7f9af976ce\": container with ID starting with 9792a006f53b00fb0d095ab32ee3587c1918c06cfdcc1f1960505d7f9af976ce not found: ID does not exist" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.537958 4776 scope.go:117] "RemoveContainer" containerID="b5ae8669b0ba21fd374df377207b8c59963d908e14ec6ccdc876f3ec1c5e9d3d" Dec 04 09:45:54 crc kubenswrapper[4776]: E1204 09:45:54.538245 4776 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"b5ae8669b0ba21fd374df377207b8c59963d908e14ec6ccdc876f3ec1c5e9d3d\": container with ID starting with b5ae8669b0ba21fd374df377207b8c59963d908e14ec6ccdc876f3ec1c5e9d3d not found: ID does not exist" containerID="b5ae8669b0ba21fd374df377207b8c59963d908e14ec6ccdc876f3ec1c5e9d3d" Dec 04 09:45:54 crc kubenswrapper[4776]: I1204 09:45:54.538281 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ae8669b0ba21fd374df377207b8c59963d908e14ec6ccdc876f3ec1c5e9d3d"} err="failed to get container status \"b5ae8669b0ba21fd374df377207b8c59963d908e14ec6ccdc876f3ec1c5e9d3d\": rpc error: code = NotFound desc = could not find container \"b5ae8669b0ba21fd374df377207b8c59963d908e14ec6ccdc876f3ec1c5e9d3d\": container with ID starting with b5ae8669b0ba21fd374df377207b8c59963d908e14ec6ccdc876f3ec1c5e9d3d not found: ID does not exist" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248458 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dsqhv"] Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248644 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" containerName="registry-server" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248656 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" containerName="registry-server" Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248670 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" containerName="extract-utilities" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248677 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" containerName="extract-utilities" Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248684 4776 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" containerName="registry-server" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248690 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" containerName="registry-server" Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248697 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" containerName="extract-content" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248703 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" containerName="extract-content" Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248713 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719304d2-2416-40be-b76a-ca884c683161" containerName="registry-server" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248718 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="719304d2-2416-40be-b76a-ca884c683161" containerName="registry-server" Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248726 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e450f38-92b1-4da3-8cb6-353756403eb6" containerName="marketplace-operator" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248732 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e450f38-92b1-4da3-8cb6-353756403eb6" containerName="marketplace-operator" Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248740 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e450f38-92b1-4da3-8cb6-353756403eb6" containerName="marketplace-operator" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248746 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e450f38-92b1-4da3-8cb6-353756403eb6" containerName="marketplace-operator" Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248757 
4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b861681-465a-4c51-8663-ecd652c7c7b0" containerName="extract-utilities" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248765 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b861681-465a-4c51-8663-ecd652c7c7b0" containerName="extract-utilities" Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248773 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719304d2-2416-40be-b76a-ca884c683161" containerName="extract-content" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248779 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="719304d2-2416-40be-b76a-ca884c683161" containerName="extract-content" Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248785 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b861681-465a-4c51-8663-ecd652c7c7b0" containerName="extract-content" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248791 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b861681-465a-4c51-8663-ecd652c7c7b0" containerName="extract-content" Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248798 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b861681-465a-4c51-8663-ecd652c7c7b0" containerName="registry-server" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248804 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b861681-465a-4c51-8663-ecd652c7c7b0" containerName="registry-server" Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248813 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719304d2-2416-40be-b76a-ca884c683161" containerName="extract-utilities" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248819 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="719304d2-2416-40be-b76a-ca884c683161" containerName="extract-utilities" Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248832 4776 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" containerName="extract-utilities" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248840 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" containerName="extract-utilities" Dec 04 09:45:55 crc kubenswrapper[4776]: E1204 09:45:55.248850 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" containerName="extract-content" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248858 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" containerName="extract-content" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248966 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="719304d2-2416-40be-b76a-ca884c683161" containerName="registry-server" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248976 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e450f38-92b1-4da3-8cb6-353756403eb6" containerName="marketplace-operator" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248985 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" containerName="registry-server" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.248997 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b861681-465a-4c51-8663-ecd652c7c7b0" containerName="registry-server" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.249003 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" containerName="registry-server" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.249151 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e450f38-92b1-4da3-8cb6-353756403eb6" containerName="marketplace-operator" Dec 04 09:45:55 crc kubenswrapper[4776]: 
I1204 09:45:55.249722 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.251342 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.260030 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsqhv"] Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.286637 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nvzbs" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.346897 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p55lq\" (UniqueName: \"kubernetes.io/projected/573cfb4b-7da3-471b-b00a-5343818665c2-kube-api-access-p55lq\") pod \"redhat-marketplace-dsqhv\" (UID: \"573cfb4b-7da3-471b-b00a-5343818665c2\") " pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.347035 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/573cfb4b-7da3-471b-b00a-5343818665c2-utilities\") pod \"redhat-marketplace-dsqhv\" (UID: \"573cfb4b-7da3-471b-b00a-5343818665c2\") " pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.347122 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/573cfb4b-7da3-471b-b00a-5343818665c2-catalog-content\") pod \"redhat-marketplace-dsqhv\" (UID: \"573cfb4b-7da3-471b-b00a-5343818665c2\") " pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:45:55 crc kubenswrapper[4776]: 
I1204 09:45:55.448332 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p55lq\" (UniqueName: \"kubernetes.io/projected/573cfb4b-7da3-471b-b00a-5343818665c2-kube-api-access-p55lq\") pod \"redhat-marketplace-dsqhv\" (UID: \"573cfb4b-7da3-471b-b00a-5343818665c2\") " pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.448387 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/573cfb4b-7da3-471b-b00a-5343818665c2-utilities\") pod \"redhat-marketplace-dsqhv\" (UID: \"573cfb4b-7da3-471b-b00a-5343818665c2\") " pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.448444 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/573cfb4b-7da3-471b-b00a-5343818665c2-catalog-content\") pod \"redhat-marketplace-dsqhv\" (UID: \"573cfb4b-7da3-471b-b00a-5343818665c2\") " pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.448877 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-df8j6"] Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.449202 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/573cfb4b-7da3-471b-b00a-5343818665c2-catalog-content\") pod \"redhat-marketplace-dsqhv\" (UID: \"573cfb4b-7da3-471b-b00a-5343818665c2\") " pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.450011 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/573cfb4b-7da3-471b-b00a-5343818665c2-utilities\") pod \"redhat-marketplace-dsqhv\" (UID: 
\"573cfb4b-7da3-471b-b00a-5343818665c2\") " pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.450146 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.453097 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.466449 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b861681-465a-4c51-8663-ecd652c7c7b0" path="/var/lib/kubelet/pods/0b861681-465a-4c51-8663-ecd652c7c7b0/volumes" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.467280 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e450f38-92b1-4da3-8cb6-353756403eb6" path="/var/lib/kubelet/pods/1e450f38-92b1-4da3-8cb6-353756403eb6/volumes" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.467875 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719304d2-2416-40be-b76a-ca884c683161" path="/var/lib/kubelet/pods/719304d2-2416-40be-b76a-ca884c683161/volumes" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.469326 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d11071-0dee-4b5f-989e-36b89f0eb26f" path="/var/lib/kubelet/pods/b1d11071-0dee-4b5f-989e-36b89f0eb26f/volumes" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.471736 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p55lq\" (UniqueName: \"kubernetes.io/projected/573cfb4b-7da3-471b-b00a-5343818665c2-kube-api-access-p55lq\") pod \"redhat-marketplace-dsqhv\" (UID: \"573cfb4b-7da3-471b-b00a-5343818665c2\") " pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.478458 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="e9c47789-6fcb-4f3d-9b38-99643a8fe1a2" path="/var/lib/kubelet/pods/e9c47789-6fcb-4f3d-9b38-99643a8fe1a2/volumes" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.480523 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-df8j6"] Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.549354 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9k45\" (UniqueName: \"kubernetes.io/projected/6701794f-bfce-4c33-bcd4-08a8225ca4e3-kube-api-access-l9k45\") pod \"redhat-operators-df8j6\" (UID: \"6701794f-bfce-4c33-bcd4-08a8225ca4e3\") " pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.549422 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6701794f-bfce-4c33-bcd4-08a8225ca4e3-utilities\") pod \"redhat-operators-df8j6\" (UID: \"6701794f-bfce-4c33-bcd4-08a8225ca4e3\") " pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.549508 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6701794f-bfce-4c33-bcd4-08a8225ca4e3-catalog-content\") pod \"redhat-operators-df8j6\" (UID: \"6701794f-bfce-4c33-bcd4-08a8225ca4e3\") " pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.566641 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.650484 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9k45\" (UniqueName: \"kubernetes.io/projected/6701794f-bfce-4c33-bcd4-08a8225ca4e3-kube-api-access-l9k45\") pod \"redhat-operators-df8j6\" (UID: \"6701794f-bfce-4c33-bcd4-08a8225ca4e3\") " pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.651311 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6701794f-bfce-4c33-bcd4-08a8225ca4e3-utilities\") pod \"redhat-operators-df8j6\" (UID: \"6701794f-bfce-4c33-bcd4-08a8225ca4e3\") " pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.652230 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6701794f-bfce-4c33-bcd4-08a8225ca4e3-utilities\") pod \"redhat-operators-df8j6\" (UID: \"6701794f-bfce-4c33-bcd4-08a8225ca4e3\") " pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.652379 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6701794f-bfce-4c33-bcd4-08a8225ca4e3-catalog-content\") pod \"redhat-operators-df8j6\" (UID: \"6701794f-bfce-4c33-bcd4-08a8225ca4e3\") " pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.652689 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6701794f-bfce-4c33-bcd4-08a8225ca4e3-catalog-content\") pod \"redhat-operators-df8j6\" (UID: \"6701794f-bfce-4c33-bcd4-08a8225ca4e3\") " 
pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.672431 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9k45\" (UniqueName: \"kubernetes.io/projected/6701794f-bfce-4c33-bcd4-08a8225ca4e3-kube-api-access-l9k45\") pod \"redhat-operators-df8j6\" (UID: \"6701794f-bfce-4c33-bcd4-08a8225ca4e3\") " pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.773589 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:45:55 crc kubenswrapper[4776]: I1204 09:45:55.973375 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dsqhv"] Dec 04 09:45:55 crc kubenswrapper[4776]: W1204 09:45:55.982505 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573cfb4b_7da3_471b_b00a_5343818665c2.slice/crio-b7418c255390d93a13e40da5b9ff1148351e07a7f9f13ffc66a8c9a30034ed52 WatchSource:0}: Error finding container b7418c255390d93a13e40da5b9ff1148351e07a7f9f13ffc66a8c9a30034ed52: Status 404 returned error can't find the container with id b7418c255390d93a13e40da5b9ff1148351e07a7f9f13ffc66a8c9a30034ed52 Dec 04 09:45:56 crc kubenswrapper[4776]: I1204 09:45:56.150060 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-df8j6"] Dec 04 09:45:56 crc kubenswrapper[4776]: W1204 09:45:56.181039 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6701794f_bfce_4c33_bcd4_08a8225ca4e3.slice/crio-1ed607a5b67b7552bfbe49d3152fd2ee974670494b929c540899a04013e8c15c WatchSource:0}: Error finding container 1ed607a5b67b7552bfbe49d3152fd2ee974670494b929c540899a04013e8c15c: Status 404 returned error can't find the container with id 
1ed607a5b67b7552bfbe49d3152fd2ee974670494b929c540899a04013e8c15c Dec 04 09:45:56 crc kubenswrapper[4776]: I1204 09:45:56.292134 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df8j6" event={"ID":"6701794f-bfce-4c33-bcd4-08a8225ca4e3","Type":"ContainerStarted","Data":"1ed607a5b67b7552bfbe49d3152fd2ee974670494b929c540899a04013e8c15c"} Dec 04 09:45:56 crc kubenswrapper[4776]: I1204 09:45:56.293879 4776 generic.go:334] "Generic (PLEG): container finished" podID="573cfb4b-7da3-471b-b00a-5343818665c2" containerID="690a3e4b5a6c970f8c18169670567b48f4200af8d55651c3a082bdff7bedc810" exitCode=0 Dec 04 09:45:56 crc kubenswrapper[4776]: I1204 09:45:56.293940 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsqhv" event={"ID":"573cfb4b-7da3-471b-b00a-5343818665c2","Type":"ContainerDied","Data":"690a3e4b5a6c970f8c18169670567b48f4200af8d55651c3a082bdff7bedc810"} Dec 04 09:45:56 crc kubenswrapper[4776]: I1204 09:45:56.293990 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsqhv" event={"ID":"573cfb4b-7da3-471b-b00a-5343818665c2","Type":"ContainerStarted","Data":"b7418c255390d93a13e40da5b9ff1148351e07a7f9f13ffc66a8c9a30034ed52"} Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.300762 4776 generic.go:334] "Generic (PLEG): container finished" podID="6701794f-bfce-4c33-bcd4-08a8225ca4e3" containerID="106da1ae3976377db0dbff2af53aa37732587fe20bb1569d53bd050d76dc2cd4" exitCode=0 Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.300874 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df8j6" event={"ID":"6701794f-bfce-4c33-bcd4-08a8225ca4e3","Type":"ContainerDied","Data":"106da1ae3976377db0dbff2af53aa37732587fe20bb1569d53bd050d76dc2cd4"} Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.653395 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-xk7pg"] Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.654738 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.658697 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.667259 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xk7pg"] Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.678241 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2qgp\" (UniqueName: \"kubernetes.io/projected/4d5e770b-fc01-43d8-9ebf-4d8a791330b7-kube-api-access-j2qgp\") pod \"certified-operators-xk7pg\" (UID: \"4d5e770b-fc01-43d8-9ebf-4d8a791330b7\") " pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.678402 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d5e770b-fc01-43d8-9ebf-4d8a791330b7-catalog-content\") pod \"certified-operators-xk7pg\" (UID: \"4d5e770b-fc01-43d8-9ebf-4d8a791330b7\") " pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.678454 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d5e770b-fc01-43d8-9ebf-4d8a791330b7-utilities\") pod \"certified-operators-xk7pg\" (UID: \"4d5e770b-fc01-43d8-9ebf-4d8a791330b7\") " pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.779050 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-j2qgp\" (UniqueName: \"kubernetes.io/projected/4d5e770b-fc01-43d8-9ebf-4d8a791330b7-kube-api-access-j2qgp\") pod \"certified-operators-xk7pg\" (UID: \"4d5e770b-fc01-43d8-9ebf-4d8a791330b7\") " pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.779160 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d5e770b-fc01-43d8-9ebf-4d8a791330b7-catalog-content\") pod \"certified-operators-xk7pg\" (UID: \"4d5e770b-fc01-43d8-9ebf-4d8a791330b7\") " pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.779190 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d5e770b-fc01-43d8-9ebf-4d8a791330b7-utilities\") pod \"certified-operators-xk7pg\" (UID: \"4d5e770b-fc01-43d8-9ebf-4d8a791330b7\") " pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.779838 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d5e770b-fc01-43d8-9ebf-4d8a791330b7-catalog-content\") pod \"certified-operators-xk7pg\" (UID: \"4d5e770b-fc01-43d8-9ebf-4d8a791330b7\") " pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.779979 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d5e770b-fc01-43d8-9ebf-4d8a791330b7-utilities\") pod \"certified-operators-xk7pg\" (UID: \"4d5e770b-fc01-43d8-9ebf-4d8a791330b7\") " pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.804513 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2qgp\" (UniqueName: 
\"kubernetes.io/projected/4d5e770b-fc01-43d8-9ebf-4d8a791330b7-kube-api-access-j2qgp\") pod \"certified-operators-xk7pg\" (UID: \"4d5e770b-fc01-43d8-9ebf-4d8a791330b7\") " pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.854169 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ncssp"] Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.855368 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.858556 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.866703 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ncssp"] Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.880086 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7fd8145-8236-4f79-a7fe-67009d283ef5-catalog-content\") pod \"community-operators-ncssp\" (UID: \"e7fd8145-8236-4f79-a7fe-67009d283ef5\") " pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.880156 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7fd8145-8236-4f79-a7fe-67009d283ef5-utilities\") pod \"community-operators-ncssp\" (UID: \"e7fd8145-8236-4f79-a7fe-67009d283ef5\") " pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.880213 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpv4v\" (UniqueName: 
\"kubernetes.io/projected/e7fd8145-8236-4f79-a7fe-67009d283ef5-kube-api-access-zpv4v\") pod \"community-operators-ncssp\" (UID: \"e7fd8145-8236-4f79-a7fe-67009d283ef5\") " pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.969933 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.980877 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7fd8145-8236-4f79-a7fe-67009d283ef5-catalog-content\") pod \"community-operators-ncssp\" (UID: \"e7fd8145-8236-4f79-a7fe-67009d283ef5\") " pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.980938 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7fd8145-8236-4f79-a7fe-67009d283ef5-utilities\") pod \"community-operators-ncssp\" (UID: \"e7fd8145-8236-4f79-a7fe-67009d283ef5\") " pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.980988 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpv4v\" (UniqueName: \"kubernetes.io/projected/e7fd8145-8236-4f79-a7fe-67009d283ef5-kube-api-access-zpv4v\") pod \"community-operators-ncssp\" (UID: \"e7fd8145-8236-4f79-a7fe-67009d283ef5\") " pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.981430 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7fd8145-8236-4f79-a7fe-67009d283ef5-catalog-content\") pod \"community-operators-ncssp\" (UID: \"e7fd8145-8236-4f79-a7fe-67009d283ef5\") " pod="openshift-marketplace/community-operators-ncssp" 
Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.981484 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7fd8145-8236-4f79-a7fe-67009d283ef5-utilities\") pod \"community-operators-ncssp\" (UID: \"e7fd8145-8236-4f79-a7fe-67009d283ef5\") " pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:45:57 crc kubenswrapper[4776]: I1204 09:45:57.999165 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpv4v\" (UniqueName: \"kubernetes.io/projected/e7fd8145-8236-4f79-a7fe-67009d283ef5-kube-api-access-zpv4v\") pod \"community-operators-ncssp\" (UID: \"e7fd8145-8236-4f79-a7fe-67009d283ef5\") " pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:45:58 crc kubenswrapper[4776]: I1204 09:45:58.176212 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:45:58 crc kubenswrapper[4776]: I1204 09:45:58.400389 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xk7pg"] Dec 04 09:45:58 crc kubenswrapper[4776]: W1204 09:45:58.409214 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d5e770b_fc01_43d8_9ebf_4d8a791330b7.slice/crio-69de2934b1b63222e2550942661efc5cd1a51c99b3721d766472787403df610a WatchSource:0}: Error finding container 69de2934b1b63222e2550942661efc5cd1a51c99b3721d766472787403df610a: Status 404 returned error can't find the container with id 69de2934b1b63222e2550942661efc5cd1a51c99b3721d766472787403df610a Dec 04 09:45:58 crc kubenswrapper[4776]: I1204 09:45:58.600782 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ncssp"] Dec 04 09:45:58 crc kubenswrapper[4776]: W1204 09:45:58.648064 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7fd8145_8236_4f79_a7fe_67009d283ef5.slice/crio-54e2373bb328d2412dae675ad38dd981b4a57b1ffd13be42ba0654011c54864b WatchSource:0}: Error finding container 54e2373bb328d2412dae675ad38dd981b4a57b1ffd13be42ba0654011c54864b: Status 404 returned error can't find the container with id 54e2373bb328d2412dae675ad38dd981b4a57b1ffd13be42ba0654011c54864b Dec 04 09:45:59 crc kubenswrapper[4776]: I1204 09:45:59.313841 4776 generic.go:334] "Generic (PLEG): container finished" podID="4d5e770b-fc01-43d8-9ebf-4d8a791330b7" containerID="47468424a9b5e0c3462d9c6a3d6a8b9da16c8d5471f493cc36852c7957a7a97d" exitCode=0 Dec 04 09:45:59 crc kubenswrapper[4776]: I1204 09:45:59.313965 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xk7pg" event={"ID":"4d5e770b-fc01-43d8-9ebf-4d8a791330b7","Type":"ContainerDied","Data":"47468424a9b5e0c3462d9c6a3d6a8b9da16c8d5471f493cc36852c7957a7a97d"} Dec 04 09:45:59 crc kubenswrapper[4776]: I1204 09:45:59.314284 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xk7pg" event={"ID":"4d5e770b-fc01-43d8-9ebf-4d8a791330b7","Type":"ContainerStarted","Data":"69de2934b1b63222e2550942661efc5cd1a51c99b3721d766472787403df610a"} Dec 04 09:45:59 crc kubenswrapper[4776]: I1204 09:45:59.316748 4776 generic.go:334] "Generic (PLEG): container finished" podID="e7fd8145-8236-4f79-a7fe-67009d283ef5" containerID="8ab9da716bade90ac1c2917c6d55a68bd61b582594c10c7c1b6410942c2ed462" exitCode=0 Dec 04 09:45:59 crc kubenswrapper[4776]: I1204 09:45:59.316807 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncssp" event={"ID":"e7fd8145-8236-4f79-a7fe-67009d283ef5","Type":"ContainerDied","Data":"8ab9da716bade90ac1c2917c6d55a68bd61b582594c10c7c1b6410942c2ed462"} Dec 04 09:45:59 crc kubenswrapper[4776]: I1204 09:45:59.316864 4776 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-ncssp" event={"ID":"e7fd8145-8236-4f79-a7fe-67009d283ef5","Type":"ContainerStarted","Data":"54e2373bb328d2412dae675ad38dd981b4a57b1ffd13be42ba0654011c54864b"} Dec 04 09:45:59 crc kubenswrapper[4776]: E1204 09:45:59.610183 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573cfb4b_7da3_471b_b00a_5343818665c2.slice/crio-conmon-790312b48372e508ccdc866edbd13884209d0eed56d3b97af03b6c5282230bd3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573cfb4b_7da3_471b_b00a_5343818665c2.slice/crio-790312b48372e508ccdc866edbd13884209d0eed56d3b97af03b6c5282230bd3.scope\": RecentStats: unable to find data in memory cache]" Dec 04 09:46:00 crc kubenswrapper[4776]: I1204 09:46:00.325500 4776 generic.go:334] "Generic (PLEG): container finished" podID="6701794f-bfce-4c33-bcd4-08a8225ca4e3" containerID="15c709b640565f50c8f47d21f1133c90df9fe631a6cf52bc5e56ea19ed7469a6" exitCode=0 Dec 04 09:46:00 crc kubenswrapper[4776]: I1204 09:46:00.325654 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df8j6" event={"ID":"6701794f-bfce-4c33-bcd4-08a8225ca4e3","Type":"ContainerDied","Data":"15c709b640565f50c8f47d21f1133c90df9fe631a6cf52bc5e56ea19ed7469a6"} Dec 04 09:46:00 crc kubenswrapper[4776]: I1204 09:46:00.330837 4776 generic.go:334] "Generic (PLEG): container finished" podID="e7fd8145-8236-4f79-a7fe-67009d283ef5" containerID="c35ff76925330f58313c5323ba303d032f17eee1904e1bfb7391a1c94d6958a2" exitCode=0 Dec 04 09:46:00 crc kubenswrapper[4776]: I1204 09:46:00.330909 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncssp" 
event={"ID":"e7fd8145-8236-4f79-a7fe-67009d283ef5","Type":"ContainerDied","Data":"c35ff76925330f58313c5323ba303d032f17eee1904e1bfb7391a1c94d6958a2"} Dec 04 09:46:00 crc kubenswrapper[4776]: I1204 09:46:00.334902 4776 generic.go:334] "Generic (PLEG): container finished" podID="573cfb4b-7da3-471b-b00a-5343818665c2" containerID="790312b48372e508ccdc866edbd13884209d0eed56d3b97af03b6c5282230bd3" exitCode=0 Dec 04 09:46:00 crc kubenswrapper[4776]: I1204 09:46:00.334963 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsqhv" event={"ID":"573cfb4b-7da3-471b-b00a-5343818665c2","Type":"ContainerDied","Data":"790312b48372e508ccdc866edbd13884209d0eed56d3b97af03b6c5282230bd3"} Dec 04 09:46:01 crc kubenswrapper[4776]: I1204 09:46:01.341819 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dsqhv" event={"ID":"573cfb4b-7da3-471b-b00a-5343818665c2","Type":"ContainerStarted","Data":"344675b2b0c545c93e3718d53d23fcd5327cc6bccda59734185ba76dbbe630dd"} Dec 04 09:46:01 crc kubenswrapper[4776]: I1204 09:46:01.344241 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df8j6" event={"ID":"6701794f-bfce-4c33-bcd4-08a8225ca4e3","Type":"ContainerStarted","Data":"0e082b7ac9ccee94d43978f99dcc647180e2086a6cf0c8cab36ee09eed99e850"} Dec 04 09:46:01 crc kubenswrapper[4776]: I1204 09:46:01.350617 4776 generic.go:334] "Generic (PLEG): container finished" podID="4d5e770b-fc01-43d8-9ebf-4d8a791330b7" containerID="b8484ffcaa919e8de515e351a8c8dddd80c0c62286be5d0fb9d62d66bb036016" exitCode=0 Dec 04 09:46:01 crc kubenswrapper[4776]: I1204 09:46:01.350749 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xk7pg" event={"ID":"4d5e770b-fc01-43d8-9ebf-4d8a791330b7","Type":"ContainerDied","Data":"b8484ffcaa919e8de515e351a8c8dddd80c0c62286be5d0fb9d62d66bb036016"} Dec 04 09:46:01 crc kubenswrapper[4776]: I1204 
09:46:01.354503 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncssp" event={"ID":"e7fd8145-8236-4f79-a7fe-67009d283ef5","Type":"ContainerStarted","Data":"7f458e8e20a7e4e687446d040eaf075b211172982554db9ea048d57ca2b070d7"} Dec 04 09:46:01 crc kubenswrapper[4776]: I1204 09:46:01.377488 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dsqhv" podStartSLOduration=1.938443863 podStartE2EDuration="6.377462705s" podCreationTimestamp="2025-12-04 09:45:55 +0000 UTC" firstStartedPulling="2025-12-04 09:45:56.295693447 +0000 UTC m=+401.162173824" lastFinishedPulling="2025-12-04 09:46:00.734712279 +0000 UTC m=+405.601192666" observedRunningTime="2025-12-04 09:46:01.37257906 +0000 UTC m=+406.239059437" watchObservedRunningTime="2025-12-04 09:46:01.377462705 +0000 UTC m=+406.243943102" Dec 04 09:46:01 crc kubenswrapper[4776]: I1204 09:46:01.420868 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-df8j6" podStartSLOduration=2.869752177 podStartE2EDuration="6.420843687s" podCreationTimestamp="2025-12-04 09:45:55 +0000 UTC" firstStartedPulling="2025-12-04 09:45:57.302607304 +0000 UTC m=+402.169087681" lastFinishedPulling="2025-12-04 09:46:00.853698824 +0000 UTC m=+405.720179191" observedRunningTime="2025-12-04 09:46:01.418900366 +0000 UTC m=+406.285380733" watchObservedRunningTime="2025-12-04 09:46:01.420843687 +0000 UTC m=+406.287324064" Dec 04 09:46:01 crc kubenswrapper[4776]: I1204 09:46:01.440292 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ncssp" podStartSLOduration=2.8275194580000003 podStartE2EDuration="4.440270202s" podCreationTimestamp="2025-12-04 09:45:57 +0000 UTC" firstStartedPulling="2025-12-04 09:45:59.318840034 +0000 UTC m=+404.185320421" lastFinishedPulling="2025-12-04 09:46:00.931590798 +0000 UTC 
m=+405.798071165" observedRunningTime="2025-12-04 09:46:01.439374064 +0000 UTC m=+406.305854451" watchObservedRunningTime="2025-12-04 09:46:01.440270202 +0000 UTC m=+406.306750579" Dec 04 09:46:04 crc kubenswrapper[4776]: I1204 09:46:04.376572 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xk7pg" event={"ID":"4d5e770b-fc01-43d8-9ebf-4d8a791330b7","Type":"ContainerStarted","Data":"fdd521fb66b00a429eb21461d1ddba451bc631e2e1d9f73ab35562ef31228e5f"} Dec 04 09:46:04 crc kubenswrapper[4776]: I1204 09:46:04.395688 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xk7pg" podStartSLOduration=3.641275413 podStartE2EDuration="7.395666795s" podCreationTimestamp="2025-12-04 09:45:57 +0000 UTC" firstStartedPulling="2025-12-04 09:45:59.315215809 +0000 UTC m=+404.181696206" lastFinishedPulling="2025-12-04 09:46:03.069607211 +0000 UTC m=+407.936087588" observedRunningTime="2025-12-04 09:46:04.39549478 +0000 UTC m=+409.261975167" watchObservedRunningTime="2025-12-04 09:46:04.395666795 +0000 UTC m=+409.262147172" Dec 04 09:46:05 crc kubenswrapper[4776]: I1204 09:46:05.567292 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:46:05 crc kubenswrapper[4776]: I1204 09:46:05.567356 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:46:05 crc kubenswrapper[4776]: I1204 09:46:05.620512 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:46:05 crc kubenswrapper[4776]: I1204 09:46:05.774486 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:46:05 crc kubenswrapper[4776]: I1204 09:46:05.774566 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:46:06 crc kubenswrapper[4776]: I1204 09:46:06.431012 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dsqhv" Dec 04 09:46:06 crc kubenswrapper[4776]: I1204 09:46:06.813905 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-df8j6" podUID="6701794f-bfce-4c33-bcd4-08a8225ca4e3" containerName="registry-server" probeResult="failure" output=< Dec 04 09:46:06 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 04 09:46:06 crc kubenswrapper[4776]: > Dec 04 09:46:07 crc kubenswrapper[4776]: I1204 09:46:07.971027 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:46:07 crc kubenswrapper[4776]: I1204 09:46:07.971086 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:46:08 crc kubenswrapper[4776]: I1204 09:46:08.013859 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:46:08 crc kubenswrapper[4776]: I1204 09:46:08.177677 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:46:08 crc kubenswrapper[4776]: I1204 09:46:08.178808 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:46:08 crc kubenswrapper[4776]: I1204 09:46:08.224169 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:46:08 crc kubenswrapper[4776]: I1204 09:46:08.443136 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-xk7pg" Dec 04 09:46:08 crc kubenswrapper[4776]: I1204 09:46:08.447842 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ncssp" Dec 04 09:46:15 crc kubenswrapper[4776]: I1204 09:46:15.815784 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:46:15 crc kubenswrapper[4776]: I1204 09:46:15.856222 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-df8j6" Dec 04 09:47:49 crc kubenswrapper[4776]: I1204 09:47:49.380630 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:47:49 crc kubenswrapper[4776]: I1204 09:47:49.381224 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:48:19 crc kubenswrapper[4776]: I1204 09:48:19.379489 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:48:19 crc kubenswrapper[4776]: I1204 09:48:19.379982 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:48:49 crc kubenswrapper[4776]: I1204 09:48:49.379720 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:48:49 crc kubenswrapper[4776]: I1204 09:48:49.380307 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:48:49 crc kubenswrapper[4776]: I1204 09:48:49.380357 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:48:49 crc kubenswrapper[4776]: I1204 09:48:49.380818 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac3e98a2af945b6af89c1faf1bb20724affb9b8a861a16a105f52cbc2e74a78a"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 09:48:49 crc kubenswrapper[4776]: I1204 09:48:49.380872 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://ac3e98a2af945b6af89c1faf1bb20724affb9b8a861a16a105f52cbc2e74a78a" gracePeriod=600 Dec 04 09:48:50 crc kubenswrapper[4776]: I1204 09:48:50.288584 4776 generic.go:334] "Generic 
(PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="ac3e98a2af945b6af89c1faf1bb20724affb9b8a861a16a105f52cbc2e74a78a" exitCode=0 Dec 04 09:48:50 crc kubenswrapper[4776]: I1204 09:48:50.288675 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"ac3e98a2af945b6af89c1faf1bb20724affb9b8a861a16a105f52cbc2e74a78a"} Dec 04 09:48:50 crc kubenswrapper[4776]: I1204 09:48:50.288967 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"d12dd5f0f4d1f9a752fa1f4be19cc5b8f751ba5253eb00ebb21d382d93196690"} Dec 04 09:48:50 crc kubenswrapper[4776]: I1204 09:48:50.288988 4776 scope.go:117] "RemoveContainer" containerID="52063fc4909fedc01d5493701de9543a427e45e6bf2a8e7be858f79dd4a8bd99" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.244538 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jdzzf"] Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.245975 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.262090 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jdzzf"] Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.356067 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/987a65ef-7818-4f00-8668-506e6add2943-bound-sa-token\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.356120 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/987a65ef-7818-4f00-8668-506e6add2943-registry-tls\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.356263 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.356321 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llh9d\" (UniqueName: \"kubernetes.io/projected/987a65ef-7818-4f00-8668-506e6add2943-kube-api-access-llh9d\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.356353 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/987a65ef-7818-4f00-8668-506e6add2943-trusted-ca\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.356385 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/987a65ef-7818-4f00-8668-506e6add2943-registry-certificates\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.356419 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/987a65ef-7818-4f00-8668-506e6add2943-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.356549 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/987a65ef-7818-4f00-8668-506e6add2943-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.375616 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.458248 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/987a65ef-7818-4f00-8668-506e6add2943-bound-sa-token\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.458292 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/987a65ef-7818-4f00-8668-506e6add2943-registry-tls\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.458319 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llh9d\" (UniqueName: \"kubernetes.io/projected/987a65ef-7818-4f00-8668-506e6add2943-kube-api-access-llh9d\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.458337 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/987a65ef-7818-4f00-8668-506e6add2943-trusted-ca\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.458358 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/987a65ef-7818-4f00-8668-506e6add2943-registry-certificates\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.458377 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/987a65ef-7818-4f00-8668-506e6add2943-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.458407 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/987a65ef-7818-4f00-8668-506e6add2943-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.459478 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/987a65ef-7818-4f00-8668-506e6add2943-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.460184 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/987a65ef-7818-4f00-8668-506e6add2943-registry-certificates\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.460489 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/987a65ef-7818-4f00-8668-506e6add2943-trusted-ca\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.464141 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/987a65ef-7818-4f00-8668-506e6add2943-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.464587 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/987a65ef-7818-4f00-8668-506e6add2943-registry-tls\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.476891 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/987a65ef-7818-4f00-8668-506e6add2943-bound-sa-token\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: \"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.477067 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llh9d\" (UniqueName: \"kubernetes.io/projected/987a65ef-7818-4f00-8668-506e6add2943-kube-api-access-llh9d\") pod \"image-registry-66df7c8f76-jdzzf\" (UID: 
\"987a65ef-7818-4f00-8668-506e6add2943\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.563354 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:22 crc kubenswrapper[4776]: I1204 09:50:22.813981 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jdzzf"] Dec 04 09:50:23 crc kubenswrapper[4776]: I1204 09:50:23.806696 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" event={"ID":"987a65ef-7818-4f00-8668-506e6add2943","Type":"ContainerStarted","Data":"1cc41d1a845a848dcf0461276ad34ae60636ffccea4e7189844d5664b7555223"} Dec 04 09:50:23 crc kubenswrapper[4776]: I1204 09:50:23.807101 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" event={"ID":"987a65ef-7818-4f00-8668-506e6add2943","Type":"ContainerStarted","Data":"7135bf1afc6b68a97a42ca9561743b24de314e2fce10b9ca3900f83d9b6b7a07"} Dec 04 09:50:23 crc kubenswrapper[4776]: I1204 09:50:23.807125 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:23 crc kubenswrapper[4776]: I1204 09:50:23.832402 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" podStartSLOduration=1.832385154 podStartE2EDuration="1.832385154s" podCreationTimestamp="2025-12-04 09:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:50:23.828931925 +0000 UTC m=+668.695412322" watchObservedRunningTime="2025-12-04 09:50:23.832385154 +0000 UTC m=+668.698865531" Dec 04 09:50:42 crc kubenswrapper[4776]: I1204 
09:50:42.568694 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-jdzzf" Dec 04 09:50:42 crc kubenswrapper[4776]: I1204 09:50:42.629274 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hkbvc"] Dec 04 09:50:49 crc kubenswrapper[4776]: I1204 09:50:49.379793 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:50:49 crc kubenswrapper[4776]: I1204 09:50:49.380539 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:51:07 crc kubenswrapper[4776]: I1204 09:51:07.681048 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" podUID="e358131f-46f1-40bc-9a4a-93798e8a303d" containerName="registry" containerID="cri-o://906749eca81dee8555bfc8c9ad048918e93268acc4b0da8f2b10adad6c4529e1" gracePeriod=30 Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.015183 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.054014 4776 generic.go:334] "Generic (PLEG): container finished" podID="e358131f-46f1-40bc-9a4a-93798e8a303d" containerID="906749eca81dee8555bfc8c9ad048918e93268acc4b0da8f2b10adad6c4529e1" exitCode=0 Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.054058 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" event={"ID":"e358131f-46f1-40bc-9a4a-93798e8a303d","Type":"ContainerDied","Data":"906749eca81dee8555bfc8c9ad048918e93268acc4b0da8f2b10adad6c4529e1"} Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.054090 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" event={"ID":"e358131f-46f1-40bc-9a4a-93798e8a303d","Type":"ContainerDied","Data":"c3cce169b3de727cd5feea63d91562bac72096cd5955e27a0a72e8811de7013a"} Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.054108 4776 scope.go:117] "RemoveContainer" containerID="906749eca81dee8555bfc8c9ad048918e93268acc4b0da8f2b10adad6c4529e1" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.054274 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hkbvc" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.070391 4776 scope.go:117] "RemoveContainer" containerID="906749eca81dee8555bfc8c9ad048918e93268acc4b0da8f2b10adad6c4529e1" Dec 04 09:51:08 crc kubenswrapper[4776]: E1204 09:51:08.070763 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906749eca81dee8555bfc8c9ad048918e93268acc4b0da8f2b10adad6c4529e1\": container with ID starting with 906749eca81dee8555bfc8c9ad048918e93268acc4b0da8f2b10adad6c4529e1 not found: ID does not exist" containerID="906749eca81dee8555bfc8c9ad048918e93268acc4b0da8f2b10adad6c4529e1" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.070803 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906749eca81dee8555bfc8c9ad048918e93268acc4b0da8f2b10adad6c4529e1"} err="failed to get container status \"906749eca81dee8555bfc8c9ad048918e93268acc4b0da8f2b10adad6c4529e1\": rpc error: code = NotFound desc = could not find container \"906749eca81dee8555bfc8c9ad048918e93268acc4b0da8f2b10adad6c4529e1\": container with ID starting with 906749eca81dee8555bfc8c9ad048918e93268acc4b0da8f2b10adad6c4529e1 not found: ID does not exist" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.113898 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-bound-sa-token\") pod \"e358131f-46f1-40bc-9a4a-93798e8a303d\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.113967 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e358131f-46f1-40bc-9a4a-93798e8a303d-installation-pull-secrets\") pod 
\"e358131f-46f1-40bc-9a4a-93798e8a303d\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.114183 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e358131f-46f1-40bc-9a4a-93798e8a303d\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.114297 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e358131f-46f1-40bc-9a4a-93798e8a303d-ca-trust-extracted\") pod \"e358131f-46f1-40bc-9a4a-93798e8a303d\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.114319 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-registry-tls\") pod \"e358131f-46f1-40bc-9a4a-93798e8a303d\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.114337 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e358131f-46f1-40bc-9a4a-93798e8a303d-registry-certificates\") pod \"e358131f-46f1-40bc-9a4a-93798e8a303d\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.114368 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgr2f\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-kube-api-access-bgr2f\") pod \"e358131f-46f1-40bc-9a4a-93798e8a303d\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.114385 4776 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e358131f-46f1-40bc-9a4a-93798e8a303d-trusted-ca\") pod \"e358131f-46f1-40bc-9a4a-93798e8a303d\" (UID: \"e358131f-46f1-40bc-9a4a-93798e8a303d\") " Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.115751 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e358131f-46f1-40bc-9a4a-93798e8a303d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e358131f-46f1-40bc-9a4a-93798e8a303d" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.116141 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e358131f-46f1-40bc-9a4a-93798e8a303d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e358131f-46f1-40bc-9a4a-93798e8a303d" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.122558 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e358131f-46f1-40bc-9a4a-93798e8a303d" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.122955 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e358131f-46f1-40bc-9a4a-93798e8a303d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e358131f-46f1-40bc-9a4a-93798e8a303d" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d"). 
InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.124187 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-kube-api-access-bgr2f" (OuterVolumeSpecName: "kube-api-access-bgr2f") pod "e358131f-46f1-40bc-9a4a-93798e8a303d" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d"). InnerVolumeSpecName "kube-api-access-bgr2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.128561 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e358131f-46f1-40bc-9a4a-93798e8a303d" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.131406 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e358131f-46f1-40bc-9a4a-93798e8a303d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e358131f-46f1-40bc-9a4a-93798e8a303d" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.132734 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e358131f-46f1-40bc-9a4a-93798e8a303d" (UID: "e358131f-46f1-40bc-9a4a-93798e8a303d"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.214924 4776 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.214962 4776 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e358131f-46f1-40bc-9a4a-93798e8a303d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.214977 4776 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e358131f-46f1-40bc-9a4a-93798e8a303d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.214988 4776 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.214999 4776 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e358131f-46f1-40bc-9a4a-93798e8a303d-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.215012 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgr2f\" (UniqueName: \"kubernetes.io/projected/e358131f-46f1-40bc-9a4a-93798e8a303d-kube-api-access-bgr2f\") on node \"crc\" DevicePath \"\"" Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.215024 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e358131f-46f1-40bc-9a4a-93798e8a303d-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:51:08 crc 
kubenswrapper[4776]: I1204 09:51:08.382735 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hkbvc"] Dec 04 09:51:08 crc kubenswrapper[4776]: I1204 09:51:08.387424 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hkbvc"] Dec 04 09:51:09 crc kubenswrapper[4776]: I1204 09:51:09.460700 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e358131f-46f1-40bc-9a4a-93798e8a303d" path="/var/lib/kubelet/pods/e358131f-46f1-40bc-9a4a-93798e8a303d/volumes" Dec 04 09:51:19 crc kubenswrapper[4776]: I1204 09:51:19.380283 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:51:19 crc kubenswrapper[4776]: I1204 09:51:19.380728 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:51:49 crc kubenswrapper[4776]: I1204 09:51:49.379999 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:51:49 crc kubenswrapper[4776]: I1204 09:51:49.380523 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:51:49 crc kubenswrapper[4776]: I1204 09:51:49.380578 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:51:49 crc kubenswrapper[4776]: I1204 09:51:49.381160 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d12dd5f0f4d1f9a752fa1f4be19cc5b8f751ba5253eb00ebb21d382d93196690"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 09:51:49 crc kubenswrapper[4776]: I1204 09:51:49.381218 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://d12dd5f0f4d1f9a752fa1f4be19cc5b8f751ba5253eb00ebb21d382d93196690" gracePeriod=600 Dec 04 09:51:50 crc kubenswrapper[4776]: I1204 09:51:50.329015 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="d12dd5f0f4d1f9a752fa1f4be19cc5b8f751ba5253eb00ebb21d382d93196690" exitCode=0 Dec 04 09:51:50 crc kubenswrapper[4776]: I1204 09:51:50.329102 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"d12dd5f0f4d1f9a752fa1f4be19cc5b8f751ba5253eb00ebb21d382d93196690"} Dec 04 09:51:50 crc kubenswrapper[4776]: I1204 09:51:50.329368 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" 
event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"7dee382b67ae6de15878aafacfd524a1e7ecdaa0880997ede900fe467e79e6d0"} Dec 04 09:51:50 crc kubenswrapper[4776]: I1204 09:51:50.329392 4776 scope.go:117] "RemoveContainer" containerID="ac3e98a2af945b6af89c1faf1bb20724affb9b8a861a16a105f52cbc2e74a78a" Dec 04 09:51:55 crc kubenswrapper[4776]: I1204 09:51:55.613812 4776 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.447311 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-4vx5h"] Dec 04 09:52:11 crc kubenswrapper[4776]: E1204 09:52:11.448661 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e358131f-46f1-40bc-9a4a-93798e8a303d" containerName="registry" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.448686 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e358131f-46f1-40bc-9a4a-93798e8a303d" containerName="registry" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.448827 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e358131f-46f1-40bc-9a4a-93798e8a303d" containerName="registry" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.449463 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-4vx5h" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.451807 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.451807 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.452550 4776 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-xbz6n" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.452927 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhbgn\" (UniqueName: \"kubernetes.io/projected/d19ecdb4-7502-46be-b833-c0f7608c5ce4-kube-api-access-hhbgn\") pod \"cert-manager-cainjector-7f985d654d-4vx5h\" (UID: \"d19ecdb4-7502-46be-b833-c0f7608c5ce4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-4vx5h" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.461110 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lxxbd"] Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.462015 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-lxxbd" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.464418 4776 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-7zm25" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.472367 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-s9v8v"] Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.473532 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-s9v8v" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.476315 4776 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-clgbz" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.476670 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-4vx5h"] Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.489096 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lxxbd"] Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.493129 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-s9v8v"] Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.556085 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhbgn\" (UniqueName: \"kubernetes.io/projected/d19ecdb4-7502-46be-b833-c0f7608c5ce4-kube-api-access-hhbgn\") pod \"cert-manager-cainjector-7f985d654d-4vx5h\" (UID: \"d19ecdb4-7502-46be-b833-c0f7608c5ce4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-4vx5h" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.588696 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhbgn\" (UniqueName: \"kubernetes.io/projected/d19ecdb4-7502-46be-b833-c0f7608c5ce4-kube-api-access-hhbgn\") pod \"cert-manager-cainjector-7f985d654d-4vx5h\" (UID: \"d19ecdb4-7502-46be-b833-c0f7608c5ce4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-4vx5h" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.657751 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk8bl\" (UniqueName: \"kubernetes.io/projected/4ac63db5-784c-4a99-a405-75c3d9f3909c-kube-api-access-bk8bl\") pod \"cert-manager-5b446d88c5-lxxbd\" (UID: 
\"4ac63db5-784c-4a99-a405-75c3d9f3909c\") " pod="cert-manager/cert-manager-5b446d88c5-lxxbd" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.658245 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h8vv\" (UniqueName: \"kubernetes.io/projected/688382e7-42ed-4f38-bd1e-3a0b40fa42bf-kube-api-access-6h8vv\") pod \"cert-manager-webhook-5655c58dd6-s9v8v\" (UID: \"688382e7-42ed-4f38-bd1e-3a0b40fa42bf\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-s9v8v" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.761891 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h8vv\" (UniqueName: \"kubernetes.io/projected/688382e7-42ed-4f38-bd1e-3a0b40fa42bf-kube-api-access-6h8vv\") pod \"cert-manager-webhook-5655c58dd6-s9v8v\" (UID: \"688382e7-42ed-4f38-bd1e-3a0b40fa42bf\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-s9v8v" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.761987 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk8bl\" (UniqueName: \"kubernetes.io/projected/4ac63db5-784c-4a99-a405-75c3d9f3909c-kube-api-access-bk8bl\") pod \"cert-manager-5b446d88c5-lxxbd\" (UID: \"4ac63db5-784c-4a99-a405-75c3d9f3909c\") " pod="cert-manager/cert-manager-5b446d88c5-lxxbd" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.779551 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk8bl\" (UniqueName: \"kubernetes.io/projected/4ac63db5-784c-4a99-a405-75c3d9f3909c-kube-api-access-bk8bl\") pod \"cert-manager-5b446d88c5-lxxbd\" (UID: \"4ac63db5-784c-4a99-a405-75c3d9f3909c\") " pod="cert-manager/cert-manager-5b446d88c5-lxxbd" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.783088 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h8vv\" (UniqueName: 
\"kubernetes.io/projected/688382e7-42ed-4f38-bd1e-3a0b40fa42bf-kube-api-access-6h8vv\") pod \"cert-manager-webhook-5655c58dd6-s9v8v\" (UID: \"688382e7-42ed-4f38-bd1e-3a0b40fa42bf\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-s9v8v" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.821226 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-4vx5h" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.827858 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-lxxbd" Dec 04 09:52:11 crc kubenswrapper[4776]: I1204 09:52:11.863245 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-s9v8v" Dec 04 09:52:12 crc kubenswrapper[4776]: I1204 09:52:12.080847 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-4vx5h"] Dec 04 09:52:12 crc kubenswrapper[4776]: I1204 09:52:12.092528 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 09:52:12 crc kubenswrapper[4776]: I1204 09:52:12.181175 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-s9v8v"] Dec 04 09:52:12 crc kubenswrapper[4776]: W1204 09:52:12.189549 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod688382e7_42ed_4f38_bd1e_3a0b40fa42bf.slice/crio-683ea08610a43766c99f038bcf2dffc01ed75de1c1b02837a2b95a6e45322670 WatchSource:0}: Error finding container 683ea08610a43766c99f038bcf2dffc01ed75de1c1b02837a2b95a6e45322670: Status 404 returned error can't find the container with id 683ea08610a43766c99f038bcf2dffc01ed75de1c1b02837a2b95a6e45322670 Dec 04 09:52:12 crc kubenswrapper[4776]: I1204 09:52:12.343536 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["cert-manager/cert-manager-5b446d88c5-lxxbd"] Dec 04 09:52:12 crc kubenswrapper[4776]: W1204 09:52:12.350684 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ac63db5_784c_4a99_a405_75c3d9f3909c.slice/crio-bc4efc75a6ae7b7a5da67f7cf5239981fa5ab05b89cb9e8a7d31d57185653d65 WatchSource:0}: Error finding container bc4efc75a6ae7b7a5da67f7cf5239981fa5ab05b89cb9e8a7d31d57185653d65: Status 404 returned error can't find the container with id bc4efc75a6ae7b7a5da67f7cf5239981fa5ab05b89cb9e8a7d31d57185653d65 Dec 04 09:52:12 crc kubenswrapper[4776]: I1204 09:52:12.461999 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-lxxbd" event={"ID":"4ac63db5-784c-4a99-a405-75c3d9f3909c","Type":"ContainerStarted","Data":"bc4efc75a6ae7b7a5da67f7cf5239981fa5ab05b89cb9e8a7d31d57185653d65"} Dec 04 09:52:12 crc kubenswrapper[4776]: I1204 09:52:12.463423 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-s9v8v" event={"ID":"688382e7-42ed-4f38-bd1e-3a0b40fa42bf","Type":"ContainerStarted","Data":"683ea08610a43766c99f038bcf2dffc01ed75de1c1b02837a2b95a6e45322670"} Dec 04 09:52:12 crc kubenswrapper[4776]: I1204 09:52:12.464607 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-4vx5h" event={"ID":"d19ecdb4-7502-46be-b833-c0f7608c5ce4","Type":"ContainerStarted","Data":"c44fd3652901bd1f94f143254a4233a7e48549afe112918c344fcbe000aff5f1"} Dec 04 09:52:16 crc kubenswrapper[4776]: I1204 09:52:16.503574 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-lxxbd" event={"ID":"4ac63db5-784c-4a99-a405-75c3d9f3909c","Type":"ContainerStarted","Data":"d31a9fe9a93a8d2d844e00d960967238ad47ae3120b4e630c33bf3967510c089"} Dec 04 09:52:16 crc kubenswrapper[4776]: I1204 09:52:16.506704 4776 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-s9v8v" event={"ID":"688382e7-42ed-4f38-bd1e-3a0b40fa42bf","Type":"ContainerStarted","Data":"0f2cadf876e6dafc6e40bb408ae57e63f0fc98613725d3cb49713eb752488e2a"} Dec 04 09:52:16 crc kubenswrapper[4776]: I1204 09:52:16.506825 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-s9v8v" Dec 04 09:52:16 crc kubenswrapper[4776]: I1204 09:52:16.512277 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-4vx5h" event={"ID":"d19ecdb4-7502-46be-b833-c0f7608c5ce4","Type":"ContainerStarted","Data":"3fe52c38022681c4d64b4c1168507a13ef924fd82f17e392f547d0a79a1ff592"} Dec 04 09:52:16 crc kubenswrapper[4776]: I1204 09:52:16.523261 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-lxxbd" podStartSLOduration=2.452747661 podStartE2EDuration="5.52323901s" podCreationTimestamp="2025-12-04 09:52:11 +0000 UTC" firstStartedPulling="2025-12-04 09:52:12.354264118 +0000 UTC m=+777.220744495" lastFinishedPulling="2025-12-04 09:52:15.424755477 +0000 UTC m=+780.291235844" observedRunningTime="2025-12-04 09:52:16.51943603 +0000 UTC m=+781.385916427" watchObservedRunningTime="2025-12-04 09:52:16.52323901 +0000 UTC m=+781.389719387" Dec 04 09:52:16 crc kubenswrapper[4776]: I1204 09:52:16.537447 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-s9v8v" podStartSLOduration=2.213901672 podStartE2EDuration="5.537415718s" podCreationTimestamp="2025-12-04 09:52:11 +0000 UTC" firstStartedPulling="2025-12-04 09:52:12.193467492 +0000 UTC m=+777.059947869" lastFinishedPulling="2025-12-04 09:52:15.516981538 +0000 UTC m=+780.383461915" observedRunningTime="2025-12-04 09:52:16.536428307 +0000 UTC m=+781.402908694" watchObservedRunningTime="2025-12-04 09:52:16.537415718 +0000 UTC m=+781.403896115" Dec 
04 09:52:16 crc kubenswrapper[4776]: I1204 09:52:16.561262 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-4vx5h" podStartSLOduration=2.228811852 podStartE2EDuration="5.5612435s" podCreationTimestamp="2025-12-04 09:52:11 +0000 UTC" firstStartedPulling="2025-12-04 09:52:12.092304928 +0000 UTC m=+776.958785305" lastFinishedPulling="2025-12-04 09:52:15.424736576 +0000 UTC m=+780.291216953" observedRunningTime="2025-12-04 09:52:16.557573224 +0000 UTC m=+781.424053611" watchObservedRunningTime="2025-12-04 09:52:16.5612435 +0000 UTC m=+781.427723867" Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.314789 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-q6zk4"] Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.315713 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovn-controller" containerID="cri-o://e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee" gracePeriod=30 Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.316225 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="sbdb" containerID="cri-o://e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c" gracePeriod=30 Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.316304 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="nbdb" containerID="cri-o://d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919" gracePeriod=30 Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.316360 4776 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="northd" containerID="cri-o://eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4" gracePeriod=30 Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.316413 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6" gracePeriod=30 Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.316464 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="kube-rbac-proxy-node" containerID="cri-o://e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142" gracePeriod=30 Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.316515 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovn-acl-logging" containerID="cri-o://c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec" gracePeriod=30 Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.359151 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" containerID="cri-o://5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101" gracePeriod=30 Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.549236 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/3.log" Dec 04 09:52:21 crc 
kubenswrapper[4776]: I1204 09:52:21.552954 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovn-acl-logging/0.log" Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.553463 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovn-controller/0.log" Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.553891 4776 generic.go:334] "Generic (PLEG): container finished" podID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerID="638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6" exitCode=0 Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.553946 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6"} Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.553993 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142"} Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.553959 4776 generic.go:334] "Generic (PLEG): container finished" podID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerID="e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142" exitCode=0 Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.554018 4776 generic.go:334] "Generic (PLEG): container finished" podID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerID="c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec" exitCode=143 Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.554036 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerID="e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee" exitCode=143 Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.554086 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec"} Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.554098 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee"} Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.555956 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7xv6z_423f8d5c-40c6-4efe-935f-7a9373d6becd/kube-multus/2.log" Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.557650 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7xv6z_423f8d5c-40c6-4efe-935f-7a9373d6becd/kube-multus/1.log" Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.557729 4776 generic.go:334] "Generic (PLEG): container finished" podID="423f8d5c-40c6-4efe-935f-7a9373d6becd" containerID="9eec5c6913a7c0d28163dd5e108e453b81b8bf5a5912e26cd092d47ca0f21d13" exitCode=2 Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.557780 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7xv6z" event={"ID":"423f8d5c-40c6-4efe-935f-7a9373d6becd","Type":"ContainerDied","Data":"9eec5c6913a7c0d28163dd5e108e453b81b8bf5a5912e26cd092d47ca0f21d13"} Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.557854 4776 scope.go:117] "RemoveContainer" containerID="0647b2bd40f4d09007d33837bceaec60188e87d77bae9c0f49a1d3c1a91d909c" Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.558794 
4776 scope.go:117] "RemoveContainer" containerID="9eec5c6913a7c0d28163dd5e108e453b81b8bf5a5912e26cd092d47ca0f21d13" Dec 04 09:52:21 crc kubenswrapper[4776]: I1204 09:52:21.868436 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-s9v8v" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.068047 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/3.log" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.070778 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovn-acl-logging/0.log" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.072318 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovn-controller/0.log" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.073005 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123040 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5mkz7"] Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.123292 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123311 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.123322 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovn-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123331 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovn-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.123345 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123352 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.123362 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123370 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.123379 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123386 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.123399 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="nbdb" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123406 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="nbdb" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.123417 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovn-acl-logging" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123424 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovn-acl-logging" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.123434 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="kube-rbac-proxy-node" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123442 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="kube-rbac-proxy-node" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.123453 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="kubecfg-setup" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123459 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="kubecfg-setup" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.123469 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="sbdb" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123476 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="sbdb" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.123488 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123496 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.123506 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="northd" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123512 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="northd" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123648 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovn-acl-logging" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123664 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123672 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="kube-rbac-proxy-node" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123684 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="northd" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123695 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123705 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovn-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123716 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123725 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123735 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123745 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="nbdb" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123755 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="sbdb" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.123875 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.123885 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.124045 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerName="ovnkube-controller" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.125834 4776 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203263 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-ovn\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203308 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-run-ovn-kubernetes\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203343 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-cni-netd\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203373 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203383 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203381 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-slash\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203407 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-slash" (OuterVolumeSpecName: "host-slash") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203413 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203426 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-openvswitch\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203448 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). 
InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203502 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203541 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203810 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovnkube-script-lib\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203829 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-var-lib-openvswitch\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203847 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-log-socket\") 
pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203864 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-node-log\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203883 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovnkube-config\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203898 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-cni-bin\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203940 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-systemd\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203967 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-env-overrides\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.203988 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-kubelet\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204008 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovn-node-metrics-cert\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204032 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-systemd-units\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204054 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-run-netns\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204072 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-etc-openvswitch\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204101 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vpnn\" (UniqueName: \"kubernetes.io/projected/fdc73cf8-973a-4254-9339-6c9f90c225bb-kube-api-access-6vpnn\") pod \"fdc73cf8-973a-4254-9339-6c9f90c225bb\" (UID: \"fdc73cf8-973a-4254-9339-6c9f90c225bb\") " 
Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204167 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-ovn-node-metrics-cert\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204186 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-ovnkube-config\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204208 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrhx\" (UniqueName: \"kubernetes.io/projected/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-kube-api-access-xbrhx\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204223 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-run-ovn-kubernetes\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204240 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-run-netns\") pod \"ovnkube-node-5mkz7\" (UID: 
\"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204256 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-run-openvswitch\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204278 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204294 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-kubelet\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204312 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-cni-bin\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204331 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-node-log\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204349 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-ovnkube-script-lib\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204369 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-var-lib-openvswitch\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204389 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-cni-netd\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204406 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-run-ovn\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204423 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-slash\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204439 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-etc-openvswitch\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204454 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-systemd-units\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204473 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-env-overrides\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204488 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-log-socket\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204504 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-run-systemd\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204545 4776 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204555 4776 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204564 4776 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204572 4776 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-slash\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204580 4776 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204588 4776 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.204967 4776 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.205003 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.205067 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.205090 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-log-socket" (OuterVolumeSpecName: "log-socket") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.205109 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-node-log" (OuterVolumeSpecName: "node-log") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). 
InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.205139 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.205139 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.205175 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.205355 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.205403 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.205589 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.210832 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.211574 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdc73cf8-973a-4254-9339-6c9f90c225bb-kube-api-access-6vpnn" (OuterVolumeSpecName: "kube-api-access-6vpnn") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "kube-api-access-6vpnn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.219606 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fdc73cf8-973a-4254-9339-6c9f90c225bb" (UID: "fdc73cf8-973a-4254-9339-6c9f90c225bb"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305279 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305334 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-kubelet\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305357 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-cni-bin\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305378 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-node-log\") pod \"ovnkube-node-5mkz7\" (UID: 
\"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305395 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305404 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-ovnkube-script-lib\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305452 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-cni-bin\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305464 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-var-lib-openvswitch\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305480 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-node-log\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305510 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-cni-netd\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305491 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-var-lib-openvswitch\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305478 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-kubelet\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305559 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-cni-netd\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305605 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-run-ovn\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305633 
4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-slash\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305659 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-etc-openvswitch\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305686 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-systemd-units\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305714 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-env-overrides\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305738 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-log-socket\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305765 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-run-systemd\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305856 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-ovn-node-metrics-cert\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305880 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-ovnkube-config\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305938 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrhx\" (UniqueName: \"kubernetes.io/projected/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-kube-api-access-xbrhx\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305961 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-run-ovn-kubernetes\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.305996 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-run-netns\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306022 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-run-openvswitch\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306161 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306176 4776 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306189 4776 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-log-socket\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306201 4776 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-node-log\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306229 4776 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306241 4776 
reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306254 4776 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306261 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-run-ovn\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306266 4776 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdc73cf8-973a-4254-9339-6c9f90c225bb-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306295 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-run-systemd\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306301 4776 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306324 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-slash\") pod \"ovnkube-node-5mkz7\" (UID: 
\"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306335 4776 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdc73cf8-973a-4254-9339-6c9f90c225bb-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306351 4776 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306356 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-etc-openvswitch\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306364 4776 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306377 4776 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdc73cf8-973a-4254-9339-6c9f90c225bb-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306387 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-systemd-units\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306391 4776 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vpnn\" (UniqueName: \"kubernetes.io/projected/fdc73cf8-973a-4254-9339-6c9f90c225bb-kube-api-access-6vpnn\") on node \"crc\" DevicePath \"\"" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306239 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-log-socket\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306814 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-env-overrides\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306209 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-ovnkube-script-lib\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306884 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-run-netns\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306858 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-run-openvswitch\") pod \"ovnkube-node-5mkz7\" (UID: 
\"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.306904 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-host-run-ovn-kubernetes\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.307859 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-ovnkube-config\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.310407 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-ovn-node-metrics-cert\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.326013 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrhx\" (UniqueName: \"kubernetes.io/projected/c746d53e-e53c-4ffa-ad5d-f3b8d5900715-kube-api-access-xbrhx\") pod \"ovnkube-node-5mkz7\" (UID: \"c746d53e-e53c-4ffa-ad5d-f3b8d5900715\") " pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.439542 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:22 crc kubenswrapper[4776]: W1204 09:52:22.468872 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc746d53e_e53c_4ffa_ad5d_f3b8d5900715.slice/crio-f1f97e59f2e1752b0f5b2323e769bf79833b8a839a83e67b0dd4b70e723b6112 WatchSource:0}: Error finding container f1f97e59f2e1752b0f5b2323e769bf79833b8a839a83e67b0dd4b70e723b6112: Status 404 returned error can't find the container with id f1f97e59f2e1752b0f5b2323e769bf79833b8a839a83e67b0dd4b70e723b6112 Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.567782 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovnkube-controller/3.log" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.570793 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovn-acl-logging/0.log" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.571515 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6zk4_fdc73cf8-973a-4254-9339-6c9f90c225bb/ovn-controller/0.log" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.571939 4776 generic.go:334] "Generic (PLEG): container finished" podID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerID="5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101" exitCode=0 Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.571968 4776 generic.go:334] "Generic (PLEG): container finished" podID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerID="e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c" exitCode=0 Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.571977 4776 generic.go:334] "Generic (PLEG): container finished" podID="fdc73cf8-973a-4254-9339-6c9f90c225bb" 
containerID="d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919" exitCode=0 Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.571985 4776 generic.go:334] "Generic (PLEG): container finished" podID="fdc73cf8-973a-4254-9339-6c9f90c225bb" containerID="eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4" exitCode=0 Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.572011 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101"} Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.572090 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.572116 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c"} Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.572142 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919"} Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.572194 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4"} Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.572199 4776 scope.go:117] "RemoveContainer" containerID="5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101" Dec 04 
09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.572213 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6zk4" event={"ID":"fdc73cf8-973a-4254-9339-6c9f90c225bb","Type":"ContainerDied","Data":"afe785251610872c9543291e705094727e6baef0f06e8dfb4e99cdaa291e722c"} Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.574883 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7xv6z_423f8d5c-40c6-4efe-935f-7a9373d6becd/kube-multus/2.log" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.574983 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7xv6z" event={"ID":"423f8d5c-40c6-4efe-935f-7a9373d6becd","Type":"ContainerStarted","Data":"3ccd95537aeb59dee177246cda9f1441d30a37afd49175578a62a71b9e06d4ed"} Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.575894 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" event={"ID":"c746d53e-e53c-4ffa-ad5d-f3b8d5900715","Type":"ContainerStarted","Data":"f1f97e59f2e1752b0f5b2323e769bf79833b8a839a83e67b0dd4b70e723b6112"} Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.596382 4776 scope.go:117] "RemoveContainer" containerID="2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.616040 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-q6zk4"] Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.620120 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-q6zk4"] Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.645527 4776 scope.go:117] "RemoveContainer" containerID="e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.711720 4776 scope.go:117] "RemoveContainer" 
containerID="d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.727508 4776 scope.go:117] "RemoveContainer" containerID="eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.741041 4776 scope.go:117] "RemoveContainer" containerID="638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.755077 4776 scope.go:117] "RemoveContainer" containerID="e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.771114 4776 scope.go:117] "RemoveContainer" containerID="c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.782999 4776 scope.go:117] "RemoveContainer" containerID="e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.799042 4776 scope.go:117] "RemoveContainer" containerID="b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.811886 4776 scope.go:117] "RemoveContainer" containerID="5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.812377 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101\": container with ID starting with 5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101 not found: ID does not exist" containerID="5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.812409 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101"} err="failed to get container status \"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101\": rpc error: code = NotFound desc = could not find container \"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101\": container with ID starting with 5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.812432 4776 scope.go:117] "RemoveContainer" containerID="2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.812937 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\": container with ID starting with 2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e not found: ID does not exist" containerID="2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.812985 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e"} err="failed to get container status \"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\": rpc error: code = NotFound desc = could not find container \"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\": container with ID starting with 2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.813012 4776 scope.go:117] "RemoveContainer" containerID="e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.813434 4776 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\": container with ID starting with e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c not found: ID does not exist" containerID="e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.813466 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c"} err="failed to get container status \"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\": rpc error: code = NotFound desc = could not find container \"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\": container with ID starting with e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.813488 4776 scope.go:117] "RemoveContainer" containerID="d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.813868 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\": container with ID starting with d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919 not found: ID does not exist" containerID="d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.813932 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919"} err="failed to get container status \"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\": rpc error: code = NotFound desc = could not find container 
\"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\": container with ID starting with d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.813968 4776 scope.go:117] "RemoveContainer" containerID="eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.814475 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\": container with ID starting with eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4 not found: ID does not exist" containerID="eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.814502 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4"} err="failed to get container status \"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\": rpc error: code = NotFound desc = could not find container \"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\": container with ID starting with eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.814522 4776 scope.go:117] "RemoveContainer" containerID="638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.814861 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\": container with ID starting with 638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6 not found: ID does not exist" 
containerID="638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.814893 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6"} err="failed to get container status \"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\": rpc error: code = NotFound desc = could not find container \"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\": container with ID starting with 638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.814908 4776 scope.go:117] "RemoveContainer" containerID="e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.815241 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\": container with ID starting with e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142 not found: ID does not exist" containerID="e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.815690 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142"} err="failed to get container status \"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\": rpc error: code = NotFound desc = could not find container \"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\": container with ID starting with e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.815843 4776 scope.go:117] 
"RemoveContainer" containerID="c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.816318 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\": container with ID starting with c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec not found: ID does not exist" containerID="c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.816345 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec"} err="failed to get container status \"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\": rpc error: code = NotFound desc = could not find container \"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\": container with ID starting with c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.816587 4776 scope.go:117] "RemoveContainer" containerID="e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.817048 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\": container with ID starting with e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee not found: ID does not exist" containerID="e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.817080 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee"} err="failed to get container status \"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\": rpc error: code = NotFound desc = could not find container \"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\": container with ID starting with e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.817102 4776 scope.go:117] "RemoveContainer" containerID="b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2" Dec 04 09:52:22 crc kubenswrapper[4776]: E1204 09:52:22.817414 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\": container with ID starting with b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2 not found: ID does not exist" containerID="b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.817450 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2"} err="failed to get container status \"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\": rpc error: code = NotFound desc = could not find container \"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\": container with ID starting with b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.817474 4776 scope.go:117] "RemoveContainer" containerID="5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.817747 4776 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101"} err="failed to get container status \"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101\": rpc error: code = NotFound desc = could not find container \"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101\": container with ID starting with 5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.817768 4776 scope.go:117] "RemoveContainer" containerID="2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.818064 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e"} err="failed to get container status \"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\": rpc error: code = NotFound desc = could not find container \"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\": container with ID starting with 2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.818084 4776 scope.go:117] "RemoveContainer" containerID="e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.818353 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c"} err="failed to get container status \"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\": rpc error: code = NotFound desc = could not find container \"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\": container with ID starting with e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c not 
found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.818372 4776 scope.go:117] "RemoveContainer" containerID="d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.818778 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919"} err="failed to get container status \"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\": rpc error: code = NotFound desc = could not find container \"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\": container with ID starting with d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.818797 4776 scope.go:117] "RemoveContainer" containerID="eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.819080 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4"} err="failed to get container status \"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\": rpc error: code = NotFound desc = could not find container \"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\": container with ID starting with eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.819097 4776 scope.go:117] "RemoveContainer" containerID="638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.819520 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6"} err="failed to get 
container status \"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\": rpc error: code = NotFound desc = could not find container \"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\": container with ID starting with 638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.819544 4776 scope.go:117] "RemoveContainer" containerID="e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.819816 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142"} err="failed to get container status \"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\": rpc error: code = NotFound desc = could not find container \"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\": container with ID starting with e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.819869 4776 scope.go:117] "RemoveContainer" containerID="c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.820486 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec"} err="failed to get container status \"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\": rpc error: code = NotFound desc = could not find container \"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\": container with ID starting with c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.820538 4776 scope.go:117] "RemoveContainer" 
containerID="e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.820821 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee"} err="failed to get container status \"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\": rpc error: code = NotFound desc = could not find container \"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\": container with ID starting with e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.820857 4776 scope.go:117] "RemoveContainer" containerID="b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.821214 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2"} err="failed to get container status \"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\": rpc error: code = NotFound desc = could not find container \"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\": container with ID starting with b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.821240 4776 scope.go:117] "RemoveContainer" containerID="5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.821526 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101"} err="failed to get container status \"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101\": rpc error: code = NotFound desc = could 
not find container \"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101\": container with ID starting with 5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.821549 4776 scope.go:117] "RemoveContainer" containerID="2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.821786 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e"} err="failed to get container status \"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\": rpc error: code = NotFound desc = could not find container \"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\": container with ID starting with 2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.821826 4776 scope.go:117] "RemoveContainer" containerID="e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.822236 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c"} err="failed to get container status \"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\": rpc error: code = NotFound desc = could not find container \"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\": container with ID starting with e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.822258 4776 scope.go:117] "RemoveContainer" containerID="d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 
09:52:22.822477 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919"} err="failed to get container status \"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\": rpc error: code = NotFound desc = could not find container \"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\": container with ID starting with d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.822498 4776 scope.go:117] "RemoveContainer" containerID="eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.822737 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4"} err="failed to get container status \"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\": rpc error: code = NotFound desc = could not find container \"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\": container with ID starting with eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.822758 4776 scope.go:117] "RemoveContainer" containerID="638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.823014 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6"} err="failed to get container status \"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\": rpc error: code = NotFound desc = could not find container \"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\": container with ID starting with 
638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.823048 4776 scope.go:117] "RemoveContainer" containerID="e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.823315 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142"} err="failed to get container status \"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\": rpc error: code = NotFound desc = could not find container \"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\": container with ID starting with e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.823341 4776 scope.go:117] "RemoveContainer" containerID="c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.823548 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec"} err="failed to get container status \"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\": rpc error: code = NotFound desc = could not find container \"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\": container with ID starting with c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.823567 4776 scope.go:117] "RemoveContainer" containerID="e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.823769 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee"} err="failed to get container status \"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\": rpc error: code = NotFound desc = could not find container \"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\": container with ID starting with e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.823795 4776 scope.go:117] "RemoveContainer" containerID="b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.824045 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2"} err="failed to get container status \"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\": rpc error: code = NotFound desc = could not find container \"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\": container with ID starting with b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.824065 4776 scope.go:117] "RemoveContainer" containerID="5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.824313 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101"} err="failed to get container status \"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101\": rpc error: code = NotFound desc = could not find container \"5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101\": container with ID starting with 5fc95ec8a013a9ca5262c281938bbc18de7431f52b5a6de2185f9621ffde0101 not found: ID does not 
exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.824330 4776 scope.go:117] "RemoveContainer" containerID="2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.824666 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e"} err="failed to get container status \"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\": rpc error: code = NotFound desc = could not find container \"2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e\": container with ID starting with 2cad5314cf859fd8119332f5679ca34c5ffe49856da0f4c58c02a66fd2abdd5e not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.824691 4776 scope.go:117] "RemoveContainer" containerID="e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.824910 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c"} err="failed to get container status \"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\": rpc error: code = NotFound desc = could not find container \"e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c\": container with ID starting with e9f63e2848c28f58e52abbe19b81edb61302e5909a44962979675c0473b1105c not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.824947 4776 scope.go:117] "RemoveContainer" containerID="d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.825165 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919"} err="failed to get container status 
\"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\": rpc error: code = NotFound desc = could not find container \"d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919\": container with ID starting with d17dbcc08ecb25219fb3442699c427e33a335df7a5ed933784f8fbc767597919 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.825183 4776 scope.go:117] "RemoveContainer" containerID="eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.825365 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4"} err="failed to get container status \"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\": rpc error: code = NotFound desc = could not find container \"eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4\": container with ID starting with eb54bf008640376ae7550d6e8dc55987ee58a0f501abdc376b50499c7f9e4ab4 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.825383 4776 scope.go:117] "RemoveContainer" containerID="638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.825646 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6"} err="failed to get container status \"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\": rpc error: code = NotFound desc = could not find container \"638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6\": container with ID starting with 638a8842f60d578ee01d2c5d2cf78bb0d68d130aee6ac23f26f04584eee2bfa6 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.825664 4776 scope.go:117] "RemoveContainer" 
containerID="e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.825841 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142"} err="failed to get container status \"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\": rpc error: code = NotFound desc = could not find container \"e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142\": container with ID starting with e20ec0dc0eabbb9f17ac01bc4d7981f97f6adfb3f924fe87a66f8a862ac8a142 not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.825857 4776 scope.go:117] "RemoveContainer" containerID="c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.826118 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec"} err="failed to get container status \"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\": rpc error: code = NotFound desc = could not find container \"c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec\": container with ID starting with c2a872cbad7e68218f9b50ff4cb3f2d48dbfad4230cf77afe3e55ec71ef643ec not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.826166 4776 scope.go:117] "RemoveContainer" containerID="e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.826358 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee"} err="failed to get container status \"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\": rpc error: code = NotFound desc = could 
not find container \"e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee\": container with ID starting with e621379b8b77146a72758fd91e79cd7ae186ce926ac6dccf405a69f18ecb04ee not found: ID does not exist" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.826392 4776 scope.go:117] "RemoveContainer" containerID="b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2" Dec 04 09:52:22 crc kubenswrapper[4776]: I1204 09:52:22.826684 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2"} err="failed to get container status \"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\": rpc error: code = NotFound desc = could not find container \"b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2\": container with ID starting with b82ce5b7125ee38590acfbaa80edef9fb43a96980d9a5bffefd2ceea994e12d2 not found: ID does not exist" Dec 04 09:52:23 crc kubenswrapper[4776]: I1204 09:52:23.460078 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdc73cf8-973a-4254-9339-6c9f90c225bb" path="/var/lib/kubelet/pods/fdc73cf8-973a-4254-9339-6c9f90c225bb/volumes" Dec 04 09:52:23 crc kubenswrapper[4776]: I1204 09:52:23.582595 4776 generic.go:334] "Generic (PLEG): container finished" podID="c746d53e-e53c-4ffa-ad5d-f3b8d5900715" containerID="e369b69587bacd5c2f3c62b46302bf5247299f2c4acd32fc79d2f13bbf8319b1" exitCode=0 Dec 04 09:52:23 crc kubenswrapper[4776]: I1204 09:52:23.582669 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" event={"ID":"c746d53e-e53c-4ffa-ad5d-f3b8d5900715","Type":"ContainerDied","Data":"e369b69587bacd5c2f3c62b46302bf5247299f2c4acd32fc79d2f13bbf8319b1"} Dec 04 09:52:24 crc kubenswrapper[4776]: I1204 09:52:24.596036 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" 
event={"ID":"c746d53e-e53c-4ffa-ad5d-f3b8d5900715","Type":"ContainerStarted","Data":"3b0db886c6147efd142c39aa25a9bf4de06ba59b4baae1202987f9390977934d"} Dec 04 09:52:24 crc kubenswrapper[4776]: I1204 09:52:24.597600 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" event={"ID":"c746d53e-e53c-4ffa-ad5d-f3b8d5900715","Type":"ContainerStarted","Data":"e7436f5c0a6125dc9172ff0b322bce77b71cf8e8688876a838bc235cf4d3b407"} Dec 04 09:52:24 crc kubenswrapper[4776]: I1204 09:52:24.597704 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" event={"ID":"c746d53e-e53c-4ffa-ad5d-f3b8d5900715","Type":"ContainerStarted","Data":"9971ff0d5ef703b150c0cde05b143f094fe1af4569f0dfdb4a49bbfd6344b23f"} Dec 04 09:52:24 crc kubenswrapper[4776]: I1204 09:52:24.597781 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" event={"ID":"c746d53e-e53c-4ffa-ad5d-f3b8d5900715","Type":"ContainerStarted","Data":"4b04bd459345c2066e9e96e7851046b9e53d2b9d677f75b5b3ff672423a6da48"} Dec 04 09:52:24 crc kubenswrapper[4776]: I1204 09:52:24.597861 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" event={"ID":"c746d53e-e53c-4ffa-ad5d-f3b8d5900715","Type":"ContainerStarted","Data":"1a7736b052f94fcf9aed923a61148db896f7f4cdf7e57710cb4df7a1ae4e8445"} Dec 04 09:52:24 crc kubenswrapper[4776]: I1204 09:52:24.597974 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" event={"ID":"c746d53e-e53c-4ffa-ad5d-f3b8d5900715","Type":"ContainerStarted","Data":"67367c1832174fa5ea08c8ae42e2354222399276926d41d5cbd83a4a9406a5cc"} Dec 04 09:52:26 crc kubenswrapper[4776]: I1204 09:52:26.611671 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" 
event={"ID":"c746d53e-e53c-4ffa-ad5d-f3b8d5900715","Type":"ContainerStarted","Data":"61ddfcfffb436b436af1a1eab1012ec8bd13c3af5eca258652df507a3c06101c"} Dec 04 09:52:29 crc kubenswrapper[4776]: I1204 09:52:29.636340 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" event={"ID":"c746d53e-e53c-4ffa-ad5d-f3b8d5900715","Type":"ContainerStarted","Data":"d97bd2733f87114e0a69383a7ebf46897f63ca108af1c8c49ed5acf956a33e8b"} Dec 04 09:52:29 crc kubenswrapper[4776]: I1204 09:52:29.636731 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:29 crc kubenswrapper[4776]: I1204 09:52:29.676672 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:29 crc kubenswrapper[4776]: I1204 09:52:29.678670 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" podStartSLOduration=7.678658753 podStartE2EDuration="7.678658753s" podCreationTimestamp="2025-12-04 09:52:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:52:29.674627266 +0000 UTC m=+794.541107663" watchObservedRunningTime="2025-12-04 09:52:29.678658753 +0000 UTC m=+794.545139130" Dec 04 09:52:30 crc kubenswrapper[4776]: I1204 09:52:30.643737 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:30 crc kubenswrapper[4776]: I1204 09:52:30.644107 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:30 crc kubenswrapper[4776]: I1204 09:52:30.670131 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:52:52 crc 
kubenswrapper[4776]: I1204 09:52:52.467943 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5mkz7" Dec 04 09:53:05 crc kubenswrapper[4776]: I1204 09:53:05.825637 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9"] Dec 04 09:53:05 crc kubenswrapper[4776]: I1204 09:53:05.827247 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" Dec 04 09:53:05 crc kubenswrapper[4776]: I1204 09:53:05.830518 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 09:53:05 crc kubenswrapper[4776]: I1204 09:53:05.846458 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9"] Dec 04 09:53:05 crc kubenswrapper[4776]: I1204 09:53:05.965807 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41f02085-31f0-4a13-a020-8f55ce5e481b-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9\" (UID: \"41f02085-31f0-4a13-a020-8f55ce5e481b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" Dec 04 09:53:05 crc kubenswrapper[4776]: I1204 09:53:05.965906 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp8vn\" (UniqueName: \"kubernetes.io/projected/41f02085-31f0-4a13-a020-8f55ce5e481b-kube-api-access-dp8vn\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9\" (UID: \"41f02085-31f0-4a13-a020-8f55ce5e481b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" Dec 04 09:53:05 crc 
kubenswrapper[4776]: I1204 09:53:05.966038 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41f02085-31f0-4a13-a020-8f55ce5e481b-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9\" (UID: \"41f02085-31f0-4a13-a020-8f55ce5e481b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" Dec 04 09:53:06 crc kubenswrapper[4776]: I1204 09:53:06.066808 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp8vn\" (UniqueName: \"kubernetes.io/projected/41f02085-31f0-4a13-a020-8f55ce5e481b-kube-api-access-dp8vn\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9\" (UID: \"41f02085-31f0-4a13-a020-8f55ce5e481b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" Dec 04 09:53:06 crc kubenswrapper[4776]: I1204 09:53:06.067170 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41f02085-31f0-4a13-a020-8f55ce5e481b-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9\" (UID: \"41f02085-31f0-4a13-a020-8f55ce5e481b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" Dec 04 09:53:06 crc kubenswrapper[4776]: I1204 09:53:06.067226 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41f02085-31f0-4a13-a020-8f55ce5e481b-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9\" (UID: \"41f02085-31f0-4a13-a020-8f55ce5e481b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" Dec 04 09:53:06 crc kubenswrapper[4776]: I1204 09:53:06.067725 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/41f02085-31f0-4a13-a020-8f55ce5e481b-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9\" (UID: \"41f02085-31f0-4a13-a020-8f55ce5e481b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" Dec 04 09:53:06 crc kubenswrapper[4776]: I1204 09:53:06.067737 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41f02085-31f0-4a13-a020-8f55ce5e481b-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9\" (UID: \"41f02085-31f0-4a13-a020-8f55ce5e481b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" Dec 04 09:53:06 crc kubenswrapper[4776]: I1204 09:53:06.086331 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp8vn\" (UniqueName: \"kubernetes.io/projected/41f02085-31f0-4a13-a020-8f55ce5e481b-kube-api-access-dp8vn\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9\" (UID: \"41f02085-31f0-4a13-a020-8f55ce5e481b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" Dec 04 09:53:06 crc kubenswrapper[4776]: I1204 09:53:06.153073 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" Dec 04 09:53:06 crc kubenswrapper[4776]: I1204 09:53:06.357562 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9"] Dec 04 09:53:06 crc kubenswrapper[4776]: I1204 09:53:06.828373 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" event={"ID":"41f02085-31f0-4a13-a020-8f55ce5e481b","Type":"ContainerStarted","Data":"c807ec557b261b38845097def6e58b5e7eac6f7057e5b70b7ace1b628814a4a6"} Dec 04 09:53:06 crc kubenswrapper[4776]: I1204 09:53:06.828433 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" event={"ID":"41f02085-31f0-4a13-a020-8f55ce5e481b","Type":"ContainerStarted","Data":"fbd17501100b383ba7e5b7d62be54db8d16fbf7a90a3a830300b9ab10977c84d"} Dec 04 09:53:07 crc kubenswrapper[4776]: I1204 09:53:07.835087 4776 generic.go:334] "Generic (PLEG): container finished" podID="41f02085-31f0-4a13-a020-8f55ce5e481b" containerID="c807ec557b261b38845097def6e58b5e7eac6f7057e5b70b7ace1b628814a4a6" exitCode=0 Dec 04 09:53:07 crc kubenswrapper[4776]: I1204 09:53:07.835147 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" event={"ID":"41f02085-31f0-4a13-a020-8f55ce5e481b","Type":"ContainerDied","Data":"c807ec557b261b38845097def6e58b5e7eac6f7057e5b70b7ace1b628814a4a6"} Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.189043 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xj9rg"] Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.191209 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.197552 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xj9rg"] Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.299455 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211c82ce-9073-4b9b-bea1-8fd766044263-catalog-content\") pod \"redhat-operators-xj9rg\" (UID: \"211c82ce-9073-4b9b-bea1-8fd766044263\") " pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.299516 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211c82ce-9073-4b9b-bea1-8fd766044263-utilities\") pod \"redhat-operators-xj9rg\" (UID: \"211c82ce-9073-4b9b-bea1-8fd766044263\") " pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.299561 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqbpd\" (UniqueName: \"kubernetes.io/projected/211c82ce-9073-4b9b-bea1-8fd766044263-kube-api-access-lqbpd\") pod \"redhat-operators-xj9rg\" (UID: \"211c82ce-9073-4b9b-bea1-8fd766044263\") " pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.400886 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211c82ce-9073-4b9b-bea1-8fd766044263-utilities\") pod \"redhat-operators-xj9rg\" (UID: \"211c82ce-9073-4b9b-bea1-8fd766044263\") " pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.400978 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lqbpd\" (UniqueName: \"kubernetes.io/projected/211c82ce-9073-4b9b-bea1-8fd766044263-kube-api-access-lqbpd\") pod \"redhat-operators-xj9rg\" (UID: \"211c82ce-9073-4b9b-bea1-8fd766044263\") " pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.401095 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211c82ce-9073-4b9b-bea1-8fd766044263-catalog-content\") pod \"redhat-operators-xj9rg\" (UID: \"211c82ce-9073-4b9b-bea1-8fd766044263\") " pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.401649 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211c82ce-9073-4b9b-bea1-8fd766044263-utilities\") pod \"redhat-operators-xj9rg\" (UID: \"211c82ce-9073-4b9b-bea1-8fd766044263\") " pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.401699 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211c82ce-9073-4b9b-bea1-8fd766044263-catalog-content\") pod \"redhat-operators-xj9rg\" (UID: \"211c82ce-9073-4b9b-bea1-8fd766044263\") " pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.424881 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqbpd\" (UniqueName: \"kubernetes.io/projected/211c82ce-9073-4b9b-bea1-8fd766044263-kube-api-access-lqbpd\") pod \"redhat-operators-xj9rg\" (UID: \"211c82ce-9073-4b9b-bea1-8fd766044263\") " pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.544041 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.765709 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xj9rg"] Dec 04 09:53:08 crc kubenswrapper[4776]: W1204 09:53:08.774137 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod211c82ce_9073_4b9b_bea1_8fd766044263.slice/crio-2f93598ce04efcfedcd10c42b857599ca0bbfd5feb89fff830e4da24a254091a WatchSource:0}: Error finding container 2f93598ce04efcfedcd10c42b857599ca0bbfd5feb89fff830e4da24a254091a: Status 404 returned error can't find the container with id 2f93598ce04efcfedcd10c42b857599ca0bbfd5feb89fff830e4da24a254091a Dec 04 09:53:08 crc kubenswrapper[4776]: I1204 09:53:08.844256 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xj9rg" event={"ID":"211c82ce-9073-4b9b-bea1-8fd766044263","Type":"ContainerStarted","Data":"2f93598ce04efcfedcd10c42b857599ca0bbfd5feb89fff830e4da24a254091a"} Dec 04 09:53:09 crc kubenswrapper[4776]: I1204 09:53:09.852850 4776 generic.go:334] "Generic (PLEG): container finished" podID="41f02085-31f0-4a13-a020-8f55ce5e481b" containerID="0f437b3d1cc108ffe448038fdfb3dbd4b09f2f26d2ce1335845704012416016a" exitCode=0 Dec 04 09:53:09 crc kubenswrapper[4776]: I1204 09:53:09.852959 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" event={"ID":"41f02085-31f0-4a13-a020-8f55ce5e481b","Type":"ContainerDied","Data":"0f437b3d1cc108ffe448038fdfb3dbd4b09f2f26d2ce1335845704012416016a"} Dec 04 09:53:09 crc kubenswrapper[4776]: I1204 09:53:09.856167 4776 generic.go:334] "Generic (PLEG): container finished" podID="211c82ce-9073-4b9b-bea1-8fd766044263" containerID="df3e3a13b4bd218917eac86eab88f79e5a3b76caf266e5fd0db76fa7f97c9460" exitCode=0 Dec 04 09:53:09 crc 
kubenswrapper[4776]: I1204 09:53:09.856214 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xj9rg" event={"ID":"211c82ce-9073-4b9b-bea1-8fd766044263","Type":"ContainerDied","Data":"df3e3a13b4bd218917eac86eab88f79e5a3b76caf266e5fd0db76fa7f97c9460"} Dec 04 09:53:10 crc kubenswrapper[4776]: I1204 09:53:10.864819 4776 generic.go:334] "Generic (PLEG): container finished" podID="41f02085-31f0-4a13-a020-8f55ce5e481b" containerID="1f155c85d05d546b05722e1549480d70dc6455a57109b7df6f2b229d5a7819f0" exitCode=0 Dec 04 09:53:10 crc kubenswrapper[4776]: I1204 09:53:10.864965 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" event={"ID":"41f02085-31f0-4a13-a020-8f55ce5e481b","Type":"ContainerDied","Data":"1f155c85d05d546b05722e1549480d70dc6455a57109b7df6f2b229d5a7819f0"} Dec 04 09:53:11 crc kubenswrapper[4776]: I1204 09:53:11.874326 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xj9rg" event={"ID":"211c82ce-9073-4b9b-bea1-8fd766044263","Type":"ContainerStarted","Data":"2d74f10d3b5052cc1a6ee8bbdae3502b3fe6db3c1157bf9c65d5a0113651db07"} Dec 04 09:53:12 crc kubenswrapper[4776]: I1204 09:53:12.215671 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" Dec 04 09:53:12 crc kubenswrapper[4776]: I1204 09:53:12.348989 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41f02085-31f0-4a13-a020-8f55ce5e481b-util\") pod \"41f02085-31f0-4a13-a020-8f55ce5e481b\" (UID: \"41f02085-31f0-4a13-a020-8f55ce5e481b\") " Dec 04 09:53:12 crc kubenswrapper[4776]: I1204 09:53:12.349107 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41f02085-31f0-4a13-a020-8f55ce5e481b-bundle\") pod \"41f02085-31f0-4a13-a020-8f55ce5e481b\" (UID: \"41f02085-31f0-4a13-a020-8f55ce5e481b\") " Dec 04 09:53:12 crc kubenswrapper[4776]: I1204 09:53:12.349176 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp8vn\" (UniqueName: \"kubernetes.io/projected/41f02085-31f0-4a13-a020-8f55ce5e481b-kube-api-access-dp8vn\") pod \"41f02085-31f0-4a13-a020-8f55ce5e481b\" (UID: \"41f02085-31f0-4a13-a020-8f55ce5e481b\") " Dec 04 09:53:12 crc kubenswrapper[4776]: I1204 09:53:12.349869 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41f02085-31f0-4a13-a020-8f55ce5e481b-bundle" (OuterVolumeSpecName: "bundle") pod "41f02085-31f0-4a13-a020-8f55ce5e481b" (UID: "41f02085-31f0-4a13-a020-8f55ce5e481b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:53:12 crc kubenswrapper[4776]: I1204 09:53:12.379069 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f02085-31f0-4a13-a020-8f55ce5e481b-kube-api-access-dp8vn" (OuterVolumeSpecName: "kube-api-access-dp8vn") pod "41f02085-31f0-4a13-a020-8f55ce5e481b" (UID: "41f02085-31f0-4a13-a020-8f55ce5e481b"). InnerVolumeSpecName "kube-api-access-dp8vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:53:12 crc kubenswrapper[4776]: I1204 09:53:12.380571 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41f02085-31f0-4a13-a020-8f55ce5e481b-util" (OuterVolumeSpecName: "util") pod "41f02085-31f0-4a13-a020-8f55ce5e481b" (UID: "41f02085-31f0-4a13-a020-8f55ce5e481b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:53:12 crc kubenswrapper[4776]: I1204 09:53:12.450782 4776 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41f02085-31f0-4a13-a020-8f55ce5e481b-util\") on node \"crc\" DevicePath \"\"" Dec 04 09:53:12 crc kubenswrapper[4776]: I1204 09:53:12.450829 4776 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41f02085-31f0-4a13-a020-8f55ce5e481b-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:53:12 crc kubenswrapper[4776]: I1204 09:53:12.450839 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp8vn\" (UniqueName: \"kubernetes.io/projected/41f02085-31f0-4a13-a020-8f55ce5e481b-kube-api-access-dp8vn\") on node \"crc\" DevicePath \"\"" Dec 04 09:53:12 crc kubenswrapper[4776]: I1204 09:53:12.880872 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" Dec 04 09:53:12 crc kubenswrapper[4776]: I1204 09:53:12.880906 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9" event={"ID":"41f02085-31f0-4a13-a020-8f55ce5e481b","Type":"ContainerDied","Data":"fbd17501100b383ba7e5b7d62be54db8d16fbf7a90a3a830300b9ab10977c84d"} Dec 04 09:53:12 crc kubenswrapper[4776]: I1204 09:53:12.880960 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbd17501100b383ba7e5b7d62be54db8d16fbf7a90a3a830300b9ab10977c84d" Dec 04 09:53:13 crc kubenswrapper[4776]: I1204 09:53:13.888434 4776 generic.go:334] "Generic (PLEG): container finished" podID="211c82ce-9073-4b9b-bea1-8fd766044263" containerID="2d74f10d3b5052cc1a6ee8bbdae3502b3fe6db3c1157bf9c65d5a0113651db07" exitCode=0 Dec 04 09:53:13 crc kubenswrapper[4776]: I1204 09:53:13.888511 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xj9rg" event={"ID":"211c82ce-9073-4b9b-bea1-8fd766044263","Type":"ContainerDied","Data":"2d74f10d3b5052cc1a6ee8bbdae3502b3fe6db3c1157bf9c65d5a0113651db07"} Dec 04 09:53:14 crc kubenswrapper[4776]: I1204 09:53:14.897098 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xj9rg" event={"ID":"211c82ce-9073-4b9b-bea1-8fd766044263","Type":"ContainerStarted","Data":"bdf383ea26ca3a4a12716334f1b8b1b06b1ea7de77ea78c5eb1f7e8795a7faae"} Dec 04 09:53:14 crc kubenswrapper[4776]: I1204 09:53:14.914991 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xj9rg" podStartSLOduration=2.437924145 podStartE2EDuration="6.914963351s" podCreationTimestamp="2025-12-04 09:53:08 +0000 UTC" firstStartedPulling="2025-12-04 09:53:09.857962779 +0000 UTC m=+834.724443156" 
lastFinishedPulling="2025-12-04 09:53:14.335001985 +0000 UTC m=+839.201482362" observedRunningTime="2025-12-04 09:53:14.913075042 +0000 UTC m=+839.779555449" watchObservedRunningTime="2025-12-04 09:53:14.914963351 +0000 UTC m=+839.781443758" Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.086721 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-4rmmf"] Dec 04 09:53:16 crc kubenswrapper[4776]: E1204 09:53:16.088550 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f02085-31f0-4a13-a020-8f55ce5e481b" containerName="pull" Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.088627 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f02085-31f0-4a13-a020-8f55ce5e481b" containerName="pull" Dec 04 09:53:16 crc kubenswrapper[4776]: E1204 09:53:16.088680 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f02085-31f0-4a13-a020-8f55ce5e481b" containerName="util" Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.088726 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f02085-31f0-4a13-a020-8f55ce5e481b" containerName="util" Dec 04 09:53:16 crc kubenswrapper[4776]: E1204 09:53:16.088781 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f02085-31f0-4a13-a020-8f55ce5e481b" containerName="extract" Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.088832 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f02085-31f0-4a13-a020-8f55ce5e481b" containerName="extract" Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.088986 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f02085-31f0-4a13-a020-8f55ce5e481b" containerName="extract" Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.089480 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4rmmf" Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.093251 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.093599 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-cxcxc" Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.093993 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-4rmmf"] Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.094182 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.214129 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9qqc\" (UniqueName: \"kubernetes.io/projected/af38e5f3-7acd-482c-9561-91789c242956-kube-api-access-l9qqc\") pod \"nmstate-operator-5b5b58f5c8-4rmmf\" (UID: \"af38e5f3-7acd-482c-9561-91789c242956\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4rmmf" Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.315360 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9qqc\" (UniqueName: \"kubernetes.io/projected/af38e5f3-7acd-482c-9561-91789c242956-kube-api-access-l9qqc\") pod \"nmstate-operator-5b5b58f5c8-4rmmf\" (UID: \"af38e5f3-7acd-482c-9561-91789c242956\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4rmmf" Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.335438 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9qqc\" (UniqueName: \"kubernetes.io/projected/af38e5f3-7acd-482c-9561-91789c242956-kube-api-access-l9qqc\") pod \"nmstate-operator-5b5b58f5c8-4rmmf\" (UID: 
\"af38e5f3-7acd-482c-9561-91789c242956\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4rmmf" Dec 04 09:53:16 crc kubenswrapper[4776]: I1204 09:53:16.403990 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4rmmf" Dec 04 09:53:17 crc kubenswrapper[4776]: I1204 09:53:17.080519 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-4rmmf"] Dec 04 09:53:17 crc kubenswrapper[4776]: W1204 09:53:17.089049 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf38e5f3_7acd_482c_9561_91789c242956.slice/crio-30cd02019d1dec09f3bbce9500230053a7ddf9b2f862c979180c3df4f2a8d698 WatchSource:0}: Error finding container 30cd02019d1dec09f3bbce9500230053a7ddf9b2f862c979180c3df4f2a8d698: Status 404 returned error can't find the container with id 30cd02019d1dec09f3bbce9500230053a7ddf9b2f862c979180c3df4f2a8d698 Dec 04 09:53:17 crc kubenswrapper[4776]: I1204 09:53:17.937896 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4rmmf" event={"ID":"af38e5f3-7acd-482c-9561-91789c242956","Type":"ContainerStarted","Data":"30cd02019d1dec09f3bbce9500230053a7ddf9b2f862c979180c3df4f2a8d698"} Dec 04 09:53:18 crc kubenswrapper[4776]: I1204 09:53:18.544627 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:18 crc kubenswrapper[4776]: I1204 09:53:18.545324 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:19 crc kubenswrapper[4776]: I1204 09:53:19.585072 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xj9rg" podUID="211c82ce-9073-4b9b-bea1-8fd766044263" containerName="registry-server" probeResult="failure" output=< Dec 
04 09:53:19 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 04 09:53:19 crc kubenswrapper[4776]: > Dec 04 09:53:21 crc kubenswrapper[4776]: I1204 09:53:21.967957 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4rmmf" event={"ID":"af38e5f3-7acd-482c-9561-91789c242956","Type":"ContainerStarted","Data":"c6f733355e5936723d5ebe5f99689ab4e79976a3a27275d25e790cf2fbd3e92a"} Dec 04 09:53:21 crc kubenswrapper[4776]: I1204 09:53:21.986091 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-4rmmf" podStartSLOduration=2.025351586 podStartE2EDuration="5.986070557s" podCreationTimestamp="2025-12-04 09:53:16 +0000 UTC" firstStartedPulling="2025-12-04 09:53:17.090747439 +0000 UTC m=+841.957227816" lastFinishedPulling="2025-12-04 09:53:21.05146641 +0000 UTC m=+845.917946787" observedRunningTime="2025-12-04 09:53:21.985230271 +0000 UTC m=+846.851710648" watchObservedRunningTime="2025-12-04 09:53:21.986070557 +0000 UTC m=+846.852550934" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.003802 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-ffbgz"] Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.005435 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-ffbgz" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.011335 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xcds5" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.013054 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br"] Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.014207 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.016084 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.028483 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-ffbgz"] Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.033085 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-2bzm7"] Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.034605 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.039946 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br"] Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.145965 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl7vn\" (UniqueName: \"kubernetes.io/projected/fe3aabeb-baf7-4d17-ab72-485cb4412799-kube-api-access-sl7vn\") pod \"nmstate-metrics-7f946cbc9-ffbgz\" (UID: \"fe3aabeb-baf7-4d17-ab72-485cb4412799\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-ffbgz" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.146022 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/99323994-d641-4eb4-b540-41bc2f5241ee-nmstate-lock\") pod \"nmstate-handler-2bzm7\" (UID: \"99323994-d641-4eb4-b540-41bc2f5241ee\") " pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.146074 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtxzx\" (UniqueName: 
\"kubernetes.io/projected/f0f2d721-f66d-4f50-8b31-2a879a904faf-kube-api-access-dtxzx\") pod \"nmstate-webhook-5f6d4c5ccb-ns5br\" (UID: \"f0f2d721-f66d-4f50-8b31-2a879a904faf\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.146481 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/99323994-d641-4eb4-b540-41bc2f5241ee-dbus-socket\") pod \"nmstate-handler-2bzm7\" (UID: \"99323994-d641-4eb4-b540-41bc2f5241ee\") " pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.146570 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/99323994-d641-4eb4-b540-41bc2f5241ee-ovs-socket\") pod \"nmstate-handler-2bzm7\" (UID: \"99323994-d641-4eb4-b540-41bc2f5241ee\") " pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.146632 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg7lf\" (UniqueName: \"kubernetes.io/projected/99323994-d641-4eb4-b540-41bc2f5241ee-kube-api-access-wg7lf\") pod \"nmstate-handler-2bzm7\" (UID: \"99323994-d641-4eb4-b540-41bc2f5241ee\") " pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.146668 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f0f2d721-f66d-4f50-8b31-2a879a904faf-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-ns5br\" (UID: \"f0f2d721-f66d-4f50-8b31-2a879a904faf\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.162471 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq"] Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.163160 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.165394 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-897ln" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.165627 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.166238 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.184229 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq"] Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.248329 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtxzx\" (UniqueName: \"kubernetes.io/projected/f0f2d721-f66d-4f50-8b31-2a879a904faf-kube-api-access-dtxzx\") pod \"nmstate-webhook-5f6d4c5ccb-ns5br\" (UID: \"f0f2d721-f66d-4f50-8b31-2a879a904faf\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.248403 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/99323994-d641-4eb4-b540-41bc2f5241ee-dbus-socket\") pod \"nmstate-handler-2bzm7\" (UID: \"99323994-d641-4eb4-b540-41bc2f5241ee\") " pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.248442 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/99323994-d641-4eb4-b540-41bc2f5241ee-ovs-socket\") pod \"nmstate-handler-2bzm7\" (UID: \"99323994-d641-4eb4-b540-41bc2f5241ee\") " pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.248500 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg7lf\" (UniqueName: \"kubernetes.io/projected/99323994-d641-4eb4-b540-41bc2f5241ee-kube-api-access-wg7lf\") pod \"nmstate-handler-2bzm7\" (UID: \"99323994-d641-4eb4-b540-41bc2f5241ee\") " pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.248537 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/86108b12-167c-4f7f-bbbf-566c1158e81c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pndwq\" (UID: \"86108b12-167c-4f7f-bbbf-566c1158e81c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.248558 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f0f2d721-f66d-4f50-8b31-2a879a904faf-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-ns5br\" (UID: \"f0f2d721-f66d-4f50-8b31-2a879a904faf\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.248596 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxcxs\" (UniqueName: \"kubernetes.io/projected/86108b12-167c-4f7f-bbbf-566c1158e81c-kube-api-access-bxcxs\") pod \"nmstate-console-plugin-7fbb5f6569-pndwq\" (UID: \"86108b12-167c-4f7f-bbbf-566c1158e81c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.248624 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/86108b12-167c-4f7f-bbbf-566c1158e81c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-pndwq\" (UID: \"86108b12-167c-4f7f-bbbf-566c1158e81c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.248682 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl7vn\" (UniqueName: \"kubernetes.io/projected/fe3aabeb-baf7-4d17-ab72-485cb4412799-kube-api-access-sl7vn\") pod \"nmstate-metrics-7f946cbc9-ffbgz\" (UID: \"fe3aabeb-baf7-4d17-ab72-485cb4412799\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-ffbgz" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.248703 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/99323994-d641-4eb4-b540-41bc2f5241ee-nmstate-lock\") pod \"nmstate-handler-2bzm7\" (UID: \"99323994-d641-4eb4-b540-41bc2f5241ee\") " pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.248807 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/99323994-d641-4eb4-b540-41bc2f5241ee-nmstate-lock\") pod \"nmstate-handler-2bzm7\" (UID: \"99323994-d641-4eb4-b540-41bc2f5241ee\") " pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.248861 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/99323994-d641-4eb4-b540-41bc2f5241ee-ovs-socket\") pod \"nmstate-handler-2bzm7\" (UID: \"99323994-d641-4eb4-b540-41bc2f5241ee\") " pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.249042 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/99323994-d641-4eb4-b540-41bc2f5241ee-dbus-socket\") pod \"nmstate-handler-2bzm7\" (UID: \"99323994-d641-4eb4-b540-41bc2f5241ee\") " pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.263159 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f0f2d721-f66d-4f50-8b31-2a879a904faf-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-ns5br\" (UID: \"f0f2d721-f66d-4f50-8b31-2a879a904faf\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.279824 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg7lf\" (UniqueName: \"kubernetes.io/projected/99323994-d641-4eb4-b540-41bc2f5241ee-kube-api-access-wg7lf\") pod \"nmstate-handler-2bzm7\" (UID: \"99323994-d641-4eb4-b540-41bc2f5241ee\") " pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.280456 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl7vn\" (UniqueName: \"kubernetes.io/projected/fe3aabeb-baf7-4d17-ab72-485cb4412799-kube-api-access-sl7vn\") pod \"nmstate-metrics-7f946cbc9-ffbgz\" (UID: \"fe3aabeb-baf7-4d17-ab72-485cb4412799\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-ffbgz" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.283130 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtxzx\" (UniqueName: \"kubernetes.io/projected/f0f2d721-f66d-4f50-8b31-2a879a904faf-kube-api-access-dtxzx\") pod \"nmstate-webhook-5f6d4c5ccb-ns5br\" (UID: \"f0f2d721-f66d-4f50-8b31-2a879a904faf\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.328630 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-ffbgz" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.341662 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.349642 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/86108b12-167c-4f7f-bbbf-566c1158e81c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pndwq\" (UID: \"86108b12-167c-4f7f-bbbf-566c1158e81c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.349873 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxcxs\" (UniqueName: \"kubernetes.io/projected/86108b12-167c-4f7f-bbbf-566c1158e81c-kube-api-access-bxcxs\") pod \"nmstate-console-plugin-7fbb5f6569-pndwq\" (UID: \"86108b12-167c-4f7f-bbbf-566c1158e81c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.350140 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/86108b12-167c-4f7f-bbbf-566c1158e81c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-pndwq\" (UID: \"86108b12-167c-4f7f-bbbf-566c1158e81c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.351243 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/86108b12-167c-4f7f-bbbf-566c1158e81c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-pndwq\" (UID: \"86108b12-167c-4f7f-bbbf-566c1158e81c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 
09:53:26.355142 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/86108b12-167c-4f7f-bbbf-566c1158e81c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pndwq\" (UID: \"86108b12-167c-4f7f-bbbf-566c1158e81c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.360495 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.380501 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxcxs\" (UniqueName: \"kubernetes.io/projected/86108b12-167c-4f7f-bbbf-566c1158e81c-kube-api-access-bxcxs\") pod \"nmstate-console-plugin-7fbb5f6569-pndwq\" (UID: \"86108b12-167c-4f7f-bbbf-566c1158e81c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.397109 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5567b6b6fc-82ttt"] Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.397834 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.416271 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5567b6b6fc-82ttt"] Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.450856 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-console-serving-cert\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.450943 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-oauth-serving-cert\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.450990 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-console-config\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.451021 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxrdp\" (UniqueName: \"kubernetes.io/projected/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-kube-api-access-rxrdp\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.451048 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-console-oauth-config\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.451090 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-service-ca\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.451198 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-trusted-ca-bundle\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.479254 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.555679 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-oauth-serving-cert\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.555733 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-console-config\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.555769 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxrdp\" (UniqueName: \"kubernetes.io/projected/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-kube-api-access-rxrdp\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.555795 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-console-oauth-config\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.555856 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-service-ca\") pod \"console-5567b6b6fc-82ttt\" (UID: 
\"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.555907 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-trusted-ca-bundle\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.556690 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-console-serving-cert\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.557161 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-oauth-serving-cert\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.557777 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-console-config\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.559796 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-service-ca\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " 
pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.560179 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-trusted-ca-bundle\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.564483 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-console-serving-cert\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.568430 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-console-oauth-config\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.577316 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxrdp\" (UniqueName: \"kubernetes.io/projected/2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06-kube-api-access-rxrdp\") pod \"console-5567b6b6fc-82ttt\" (UID: \"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06\") " pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.653185 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-ffbgz"] Dec 04 09:53:26 crc kubenswrapper[4776]: W1204 09:53:26.660990 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe3aabeb_baf7_4d17_ab72_485cb4412799.slice/crio-226a232ee4a3ed0926cc959e1aacfbc076184bad0fb50aaf11eb336ccdab85e6 WatchSource:0}: Error finding container 226a232ee4a3ed0926cc959e1aacfbc076184bad0fb50aaf11eb336ccdab85e6: Status 404 returned error can't find the container with id 226a232ee4a3ed0926cc959e1aacfbc076184bad0fb50aaf11eb336ccdab85e6 Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.731622 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br"] Dec 04 09:53:26 crc kubenswrapper[4776]: W1204 09:53:26.737095 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0f2d721_f66d_4f50_8b31_2a879a904faf.slice/crio-0f8c9c6ef327ccab371609802e87b45f8c10b3ea68661bbee3d6e57197aa13bc WatchSource:0}: Error finding container 0f8c9c6ef327ccab371609802e87b45f8c10b3ea68661bbee3d6e57197aa13bc: Status 404 returned error can't find the container with id 0f8c9c6ef327ccab371609802e87b45f8c10b3ea68661bbee3d6e57197aa13bc Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.741700 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:26 crc kubenswrapper[4776]: I1204 09:53:26.814902 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq"] Dec 04 09:53:27 crc kubenswrapper[4776]: I1204 09:53:27.006989 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-ffbgz" event={"ID":"fe3aabeb-baf7-4d17-ab72-485cb4412799","Type":"ContainerStarted","Data":"226a232ee4a3ed0926cc959e1aacfbc076184bad0fb50aaf11eb336ccdab85e6"} Dec 04 09:53:27 crc kubenswrapper[4776]: I1204 09:53:27.010137 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" event={"ID":"86108b12-167c-4f7f-bbbf-566c1158e81c","Type":"ContainerStarted","Data":"40270a13ecca6dab2ea0cbdd29611517d86cbeba8977da51b928a063a67d04f4"} Dec 04 09:53:27 crc kubenswrapper[4776]: I1204 09:53:27.014209 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br" event={"ID":"f0f2d721-f66d-4f50-8b31-2a879a904faf","Type":"ContainerStarted","Data":"0f8c9c6ef327ccab371609802e87b45f8c10b3ea68661bbee3d6e57197aa13bc"} Dec 04 09:53:27 crc kubenswrapper[4776]: I1204 09:53:27.020511 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-2bzm7" event={"ID":"99323994-d641-4eb4-b540-41bc2f5241ee","Type":"ContainerStarted","Data":"d933ea03c79831495b4c761b3d11ee2efda558738b9db9cd0642bb16e66468fa"} Dec 04 09:53:27 crc kubenswrapper[4776]: I1204 09:53:27.074842 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5567b6b6fc-82ttt"] Dec 04 09:53:27 crc kubenswrapper[4776]: W1204 09:53:27.078128 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab4dcb1_9bc2_42dd_98eb_7b2c97d50e06.slice/crio-a975632d4108650c229537bd80bb0eda1fb45179d320453d66b8af13c9ea4592 WatchSource:0}: Error finding container a975632d4108650c229537bd80bb0eda1fb45179d320453d66b8af13c9ea4592: Status 404 returned error can't find the container with id a975632d4108650c229537bd80bb0eda1fb45179d320453d66b8af13c9ea4592 Dec 04 09:53:28 crc kubenswrapper[4776]: I1204 09:53:28.026378 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5567b6b6fc-82ttt" event={"ID":"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06","Type":"ContainerStarted","Data":"a975632d4108650c229537bd80bb0eda1fb45179d320453d66b8af13c9ea4592"} Dec 04 09:53:28 crc kubenswrapper[4776]: I1204 09:53:28.601178 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:28 crc kubenswrapper[4776]: I1204 09:53:28.661460 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:28 crc kubenswrapper[4776]: I1204 09:53:28.839376 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xj9rg"] Dec 04 09:53:30 crc kubenswrapper[4776]: I1204 09:53:30.042427 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5567b6b6fc-82ttt" event={"ID":"2ab4dcb1-9bc2-42dd-98eb-7b2c97d50e06","Type":"ContainerStarted","Data":"13cc1ae9ec777561c3df9a63b2d69a41c9a1a26b41d043ac7f1b5c5071734218"} Dec 04 09:53:30 crc kubenswrapper[4776]: I1204 09:53:30.042752 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xj9rg" podUID="211c82ce-9073-4b9b-bea1-8fd766044263" containerName="registry-server" containerID="cri-o://bdf383ea26ca3a4a12716334f1b8b1b06b1ea7de77ea78c5eb1f7e8795a7faae" gracePeriod=2 Dec 04 09:53:30 crc 
kubenswrapper[4776]: I1204 09:53:30.067256 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5567b6b6fc-82ttt" podStartSLOduration=4.067232965 podStartE2EDuration="4.067232965s" podCreationTimestamp="2025-12-04 09:53:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:53:30.067057109 +0000 UTC m=+854.933537486" watchObservedRunningTime="2025-12-04 09:53:30.067232965 +0000 UTC m=+854.933713342" Dec 04 09:53:30 crc kubenswrapper[4776]: I1204 09:53:30.447153 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:30 crc kubenswrapper[4776]: I1204 09:53:30.609493 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqbpd\" (UniqueName: \"kubernetes.io/projected/211c82ce-9073-4b9b-bea1-8fd766044263-kube-api-access-lqbpd\") pod \"211c82ce-9073-4b9b-bea1-8fd766044263\" (UID: \"211c82ce-9073-4b9b-bea1-8fd766044263\") " Dec 04 09:53:30 crc kubenswrapper[4776]: I1204 09:53:30.609653 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211c82ce-9073-4b9b-bea1-8fd766044263-catalog-content\") pod \"211c82ce-9073-4b9b-bea1-8fd766044263\" (UID: \"211c82ce-9073-4b9b-bea1-8fd766044263\") " Dec 04 09:53:30 crc kubenswrapper[4776]: I1204 09:53:30.609710 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211c82ce-9073-4b9b-bea1-8fd766044263-utilities\") pod \"211c82ce-9073-4b9b-bea1-8fd766044263\" (UID: \"211c82ce-9073-4b9b-bea1-8fd766044263\") " Dec 04 09:53:30 crc kubenswrapper[4776]: I1204 09:53:30.612587 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/211c82ce-9073-4b9b-bea1-8fd766044263-utilities" (OuterVolumeSpecName: "utilities") pod "211c82ce-9073-4b9b-bea1-8fd766044263" (UID: "211c82ce-9073-4b9b-bea1-8fd766044263"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:53:30 crc kubenswrapper[4776]: I1204 09:53:30.628018 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211c82ce-9073-4b9b-bea1-8fd766044263-kube-api-access-lqbpd" (OuterVolumeSpecName: "kube-api-access-lqbpd") pod "211c82ce-9073-4b9b-bea1-8fd766044263" (UID: "211c82ce-9073-4b9b-bea1-8fd766044263"). InnerVolumeSpecName "kube-api-access-lqbpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:53:30 crc kubenswrapper[4776]: I1204 09:53:30.711427 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqbpd\" (UniqueName: \"kubernetes.io/projected/211c82ce-9073-4b9b-bea1-8fd766044263-kube-api-access-lqbpd\") on node \"crc\" DevicePath \"\"" Dec 04 09:53:30 crc kubenswrapper[4776]: I1204 09:53:30.711671 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/211c82ce-9073-4b9b-bea1-8fd766044263-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:53:30 crc kubenswrapper[4776]: I1204 09:53:30.731873 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/211c82ce-9073-4b9b-bea1-8fd766044263-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "211c82ce-9073-4b9b-bea1-8fd766044263" (UID: "211c82ce-9073-4b9b-bea1-8fd766044263"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:53:30 crc kubenswrapper[4776]: I1204 09:53:30.812769 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/211c82ce-9073-4b9b-bea1-8fd766044263-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:53:31 crc kubenswrapper[4776]: I1204 09:53:31.051561 4776 generic.go:334] "Generic (PLEG): container finished" podID="211c82ce-9073-4b9b-bea1-8fd766044263" containerID="bdf383ea26ca3a4a12716334f1b8b1b06b1ea7de77ea78c5eb1f7e8795a7faae" exitCode=0 Dec 04 09:53:31 crc kubenswrapper[4776]: I1204 09:53:31.051594 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xj9rg" event={"ID":"211c82ce-9073-4b9b-bea1-8fd766044263","Type":"ContainerDied","Data":"bdf383ea26ca3a4a12716334f1b8b1b06b1ea7de77ea78c5eb1f7e8795a7faae"} Dec 04 09:53:31 crc kubenswrapper[4776]: I1204 09:53:31.051645 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xj9rg" event={"ID":"211c82ce-9073-4b9b-bea1-8fd766044263","Type":"ContainerDied","Data":"2f93598ce04efcfedcd10c42b857599ca0bbfd5feb89fff830e4da24a254091a"} Dec 04 09:53:31 crc kubenswrapper[4776]: I1204 09:53:31.051651 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xj9rg" Dec 04 09:53:31 crc kubenswrapper[4776]: I1204 09:53:31.051667 4776 scope.go:117] "RemoveContainer" containerID="bdf383ea26ca3a4a12716334f1b8b1b06b1ea7de77ea78c5eb1f7e8795a7faae" Dec 04 09:53:31 crc kubenswrapper[4776]: I1204 09:53:31.086497 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xj9rg"] Dec 04 09:53:31 crc kubenswrapper[4776]: I1204 09:53:31.091604 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xj9rg"] Dec 04 09:53:31 crc kubenswrapper[4776]: I1204 09:53:31.460181 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="211c82ce-9073-4b9b-bea1-8fd766044263" path="/var/lib/kubelet/pods/211c82ce-9073-4b9b-bea1-8fd766044263/volumes" Dec 04 09:53:32 crc kubenswrapper[4776]: I1204 09:53:32.111565 4776 scope.go:117] "RemoveContainer" containerID="2d74f10d3b5052cc1a6ee8bbdae3502b3fe6db3c1157bf9c65d5a0113651db07" Dec 04 09:53:32 crc kubenswrapper[4776]: I1204 09:53:32.142234 4776 scope.go:117] "RemoveContainer" containerID="df3e3a13b4bd218917eac86eab88f79e5a3b76caf266e5fd0db76fa7f97c9460" Dec 04 09:53:32 crc kubenswrapper[4776]: I1204 09:53:32.176339 4776 scope.go:117] "RemoveContainer" containerID="bdf383ea26ca3a4a12716334f1b8b1b06b1ea7de77ea78c5eb1f7e8795a7faae" Dec 04 09:53:32 crc kubenswrapper[4776]: E1204 09:53:32.177136 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf383ea26ca3a4a12716334f1b8b1b06b1ea7de77ea78c5eb1f7e8795a7faae\": container with ID starting with bdf383ea26ca3a4a12716334f1b8b1b06b1ea7de77ea78c5eb1f7e8795a7faae not found: ID does not exist" containerID="bdf383ea26ca3a4a12716334f1b8b1b06b1ea7de77ea78c5eb1f7e8795a7faae" Dec 04 09:53:32 crc kubenswrapper[4776]: I1204 09:53:32.177207 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bdf383ea26ca3a4a12716334f1b8b1b06b1ea7de77ea78c5eb1f7e8795a7faae"} err="failed to get container status \"bdf383ea26ca3a4a12716334f1b8b1b06b1ea7de77ea78c5eb1f7e8795a7faae\": rpc error: code = NotFound desc = could not find container \"bdf383ea26ca3a4a12716334f1b8b1b06b1ea7de77ea78c5eb1f7e8795a7faae\": container with ID starting with bdf383ea26ca3a4a12716334f1b8b1b06b1ea7de77ea78c5eb1f7e8795a7faae not found: ID does not exist" Dec 04 09:53:32 crc kubenswrapper[4776]: I1204 09:53:32.177250 4776 scope.go:117] "RemoveContainer" containerID="2d74f10d3b5052cc1a6ee8bbdae3502b3fe6db3c1157bf9c65d5a0113651db07" Dec 04 09:53:32 crc kubenswrapper[4776]: E1204 09:53:32.178077 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d74f10d3b5052cc1a6ee8bbdae3502b3fe6db3c1157bf9c65d5a0113651db07\": container with ID starting with 2d74f10d3b5052cc1a6ee8bbdae3502b3fe6db3c1157bf9c65d5a0113651db07 not found: ID does not exist" containerID="2d74f10d3b5052cc1a6ee8bbdae3502b3fe6db3c1157bf9c65d5a0113651db07" Dec 04 09:53:32 crc kubenswrapper[4776]: I1204 09:53:32.178105 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d74f10d3b5052cc1a6ee8bbdae3502b3fe6db3c1157bf9c65d5a0113651db07"} err="failed to get container status \"2d74f10d3b5052cc1a6ee8bbdae3502b3fe6db3c1157bf9c65d5a0113651db07\": rpc error: code = NotFound desc = could not find container \"2d74f10d3b5052cc1a6ee8bbdae3502b3fe6db3c1157bf9c65d5a0113651db07\": container with ID starting with 2d74f10d3b5052cc1a6ee8bbdae3502b3fe6db3c1157bf9c65d5a0113651db07 not found: ID does not exist" Dec 04 09:53:32 crc kubenswrapper[4776]: I1204 09:53:32.178125 4776 scope.go:117] "RemoveContainer" containerID="df3e3a13b4bd218917eac86eab88f79e5a3b76caf266e5fd0db76fa7f97c9460" Dec 04 09:53:32 crc kubenswrapper[4776]: E1204 09:53:32.178501 4776 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"df3e3a13b4bd218917eac86eab88f79e5a3b76caf266e5fd0db76fa7f97c9460\": container with ID starting with df3e3a13b4bd218917eac86eab88f79e5a3b76caf266e5fd0db76fa7f97c9460 not found: ID does not exist" containerID="df3e3a13b4bd218917eac86eab88f79e5a3b76caf266e5fd0db76fa7f97c9460" Dec 04 09:53:32 crc kubenswrapper[4776]: I1204 09:53:32.178555 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df3e3a13b4bd218917eac86eab88f79e5a3b76caf266e5fd0db76fa7f97c9460"} err="failed to get container status \"df3e3a13b4bd218917eac86eab88f79e5a3b76caf266e5fd0db76fa7f97c9460\": rpc error: code = NotFound desc = could not find container \"df3e3a13b4bd218917eac86eab88f79e5a3b76caf266e5fd0db76fa7f97c9460\": container with ID starting with df3e3a13b4bd218917eac86eab88f79e5a3b76caf266e5fd0db76fa7f97c9460 not found: ID does not exist" Dec 04 09:53:33 crc kubenswrapper[4776]: I1204 09:53:33.067042 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-ffbgz" event={"ID":"fe3aabeb-baf7-4d17-ab72-485cb4412799","Type":"ContainerStarted","Data":"e1d9f11e034840a2d6c6dabfbd36e6a80ce7984f2f410bd206803a2392261e06"} Dec 04 09:53:33 crc kubenswrapper[4776]: I1204 09:53:33.070085 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" event={"ID":"86108b12-167c-4f7f-bbbf-566c1158e81c","Type":"ContainerStarted","Data":"619cda5fc860200f047f847ab92e75228e515b436b87a8cdd5a9ac10f37be98b"} Dec 04 09:53:33 crc kubenswrapper[4776]: I1204 09:53:33.072841 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br" event={"ID":"f0f2d721-f66d-4f50-8b31-2a879a904faf","Type":"ContainerStarted","Data":"641f9a85e19147a285701a3258922c7f9a88ac45074c5ee6c4140ecdc6c39740"} Dec 04 09:53:33 crc kubenswrapper[4776]: I1204 09:53:33.073067 4776 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br" Dec 04 09:53:33 crc kubenswrapper[4776]: I1204 09:53:33.078535 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-2bzm7" event={"ID":"99323994-d641-4eb4-b540-41bc2f5241ee","Type":"ContainerStarted","Data":"59df7f1fa0cd851550cca48f6bae9c37ec06a834cfc16988ee22d710e84c0e57"} Dec 04 09:53:33 crc kubenswrapper[4776]: I1204 09:53:33.078682 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:33 crc kubenswrapper[4776]: I1204 09:53:33.093130 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pndwq" podStartSLOduration=1.735357004 podStartE2EDuration="7.093102902s" podCreationTimestamp="2025-12-04 09:53:26 +0000 UTC" firstStartedPulling="2025-12-04 09:53:26.828259583 +0000 UTC m=+851.694739960" lastFinishedPulling="2025-12-04 09:53:32.186005471 +0000 UTC m=+857.052485858" observedRunningTime="2025-12-04 09:53:33.088032552 +0000 UTC m=+857.954512929" watchObservedRunningTime="2025-12-04 09:53:33.093102902 +0000 UTC m=+857.959583279" Dec 04 09:53:33 crc kubenswrapper[4776]: I1204 09:53:33.115072 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br" podStartSLOduration=2.6504726830000003 podStartE2EDuration="8.115055432s" podCreationTimestamp="2025-12-04 09:53:25 +0000 UTC" firstStartedPulling="2025-12-04 09:53:26.741591956 +0000 UTC m=+851.608072333" lastFinishedPulling="2025-12-04 09:53:32.206174705 +0000 UTC m=+857.072655082" observedRunningTime="2025-12-04 09:53:33.109623221 +0000 UTC m=+857.976103598" watchObservedRunningTime="2025-12-04 09:53:33.115055432 +0000 UTC m=+857.981535809" Dec 04 09:53:33 crc kubenswrapper[4776]: I1204 09:53:33.131851 4776 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-2bzm7" podStartSLOduration=1.3707828530000001 podStartE2EDuration="7.13182658s" podCreationTimestamp="2025-12-04 09:53:26 +0000 UTC" firstStartedPulling="2025-12-04 09:53:26.425514041 +0000 UTC m=+851.291994428" lastFinishedPulling="2025-12-04 09:53:32.186557788 +0000 UTC m=+857.053038155" observedRunningTime="2025-12-04 09:53:33.128027011 +0000 UTC m=+857.994507408" watchObservedRunningTime="2025-12-04 09:53:33.13182658 +0000 UTC m=+857.998306957" Dec 04 09:53:35 crc kubenswrapper[4776]: I1204 09:53:35.094376 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-ffbgz" event={"ID":"fe3aabeb-baf7-4d17-ab72-485cb4412799","Type":"ContainerStarted","Data":"30748c1d78a59c0fa8273ec9732439f01951e3781b65703b78a9b1cc1028d355"} Dec 04 09:53:35 crc kubenswrapper[4776]: I1204 09:53:35.122400 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-ffbgz" podStartSLOduration=1.921435073 podStartE2EDuration="10.122373851s" podCreationTimestamp="2025-12-04 09:53:25 +0000 UTC" firstStartedPulling="2025-12-04 09:53:26.664612913 +0000 UTC m=+851.531093290" lastFinishedPulling="2025-12-04 09:53:34.865551691 +0000 UTC m=+859.732032068" observedRunningTime="2025-12-04 09:53:35.115495465 +0000 UTC m=+859.981975842" watchObservedRunningTime="2025-12-04 09:53:35.122373851 +0000 UTC m=+859.988854228" Dec 04 09:53:36 crc kubenswrapper[4776]: I1204 09:53:36.743063 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:36 crc kubenswrapper[4776]: I1204 09:53:36.743549 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:36 crc kubenswrapper[4776]: I1204 09:53:36.748392 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:37 crc kubenswrapper[4776]: I1204 09:53:37.109672 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5567b6b6fc-82ttt" Dec 04 09:53:37 crc kubenswrapper[4776]: I1204 09:53:37.190350 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vm645"] Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.032748 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-96t5h"] Dec 04 09:53:40 crc kubenswrapper[4776]: E1204 09:53:40.033138 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211c82ce-9073-4b9b-bea1-8fd766044263" containerName="extract-content" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.033163 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="211c82ce-9073-4b9b-bea1-8fd766044263" containerName="extract-content" Dec 04 09:53:40 crc kubenswrapper[4776]: E1204 09:53:40.033185 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211c82ce-9073-4b9b-bea1-8fd766044263" containerName="extract-utilities" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.033198 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="211c82ce-9073-4b9b-bea1-8fd766044263" containerName="extract-utilities" Dec 04 09:53:40 crc kubenswrapper[4776]: E1204 09:53:40.033222 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211c82ce-9073-4b9b-bea1-8fd766044263" containerName="registry-server" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.033235 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="211c82ce-9073-4b9b-bea1-8fd766044263" containerName="registry-server" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.033391 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="211c82ce-9073-4b9b-bea1-8fd766044263" 
containerName="registry-server" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.034998 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.048417 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-96t5h"] Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.151813 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcvsg\" (UniqueName: \"kubernetes.io/projected/af9787f6-72ee-4cca-965f-422a88e62ad0-kube-api-access-wcvsg\") pod \"community-operators-96t5h\" (UID: \"af9787f6-72ee-4cca-965f-422a88e62ad0\") " pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.151872 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9787f6-72ee-4cca-965f-422a88e62ad0-utilities\") pod \"community-operators-96t5h\" (UID: \"af9787f6-72ee-4cca-965f-422a88e62ad0\") " pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.151962 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9787f6-72ee-4cca-965f-422a88e62ad0-catalog-content\") pod \"community-operators-96t5h\" (UID: \"af9787f6-72ee-4cca-965f-422a88e62ad0\") " pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.252706 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcvsg\" (UniqueName: \"kubernetes.io/projected/af9787f6-72ee-4cca-965f-422a88e62ad0-kube-api-access-wcvsg\") pod \"community-operators-96t5h\" (UID: \"af9787f6-72ee-4cca-965f-422a88e62ad0\") 
" pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.252783 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9787f6-72ee-4cca-965f-422a88e62ad0-utilities\") pod \"community-operators-96t5h\" (UID: \"af9787f6-72ee-4cca-965f-422a88e62ad0\") " pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.252828 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9787f6-72ee-4cca-965f-422a88e62ad0-catalog-content\") pod \"community-operators-96t5h\" (UID: \"af9787f6-72ee-4cca-965f-422a88e62ad0\") " pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.253329 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9787f6-72ee-4cca-965f-422a88e62ad0-catalog-content\") pod \"community-operators-96t5h\" (UID: \"af9787f6-72ee-4cca-965f-422a88e62ad0\") " pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.253379 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9787f6-72ee-4cca-965f-422a88e62ad0-utilities\") pod \"community-operators-96t5h\" (UID: \"af9787f6-72ee-4cca-965f-422a88e62ad0\") " pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.280739 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcvsg\" (UniqueName: \"kubernetes.io/projected/af9787f6-72ee-4cca-965f-422a88e62ad0-kube-api-access-wcvsg\") pod \"community-operators-96t5h\" (UID: \"af9787f6-72ee-4cca-965f-422a88e62ad0\") " 
pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.396086 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:40 crc kubenswrapper[4776]: I1204 09:53:40.627546 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-96t5h"] Dec 04 09:53:41 crc kubenswrapper[4776]: I1204 09:53:41.129635 4776 generic.go:334] "Generic (PLEG): container finished" podID="af9787f6-72ee-4cca-965f-422a88e62ad0" containerID="490e07f0e99e2990bfaf744b79e46f4e81a4c6b964cd3f86249349d3c4b339ab" exitCode=0 Dec 04 09:53:41 crc kubenswrapper[4776]: I1204 09:53:41.129679 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96t5h" event={"ID":"af9787f6-72ee-4cca-965f-422a88e62ad0","Type":"ContainerDied","Data":"490e07f0e99e2990bfaf744b79e46f4e81a4c6b964cd3f86249349d3c4b339ab"} Dec 04 09:53:41 crc kubenswrapper[4776]: I1204 09:53:41.129710 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96t5h" event={"ID":"af9787f6-72ee-4cca-965f-422a88e62ad0","Type":"ContainerStarted","Data":"09dcd268db0d6dad4b9b38bbf4e1062af10eedcd9d2b6c3ae4eee31de954ba39"} Dec 04 09:53:41 crc kubenswrapper[4776]: I1204 09:53:41.387885 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-2bzm7" Dec 04 09:53:43 crc kubenswrapper[4776]: I1204 09:53:43.568594 4776 generic.go:334] "Generic (PLEG): container finished" podID="af9787f6-72ee-4cca-965f-422a88e62ad0" containerID="691b8c1963269f94434f7c0741c8521696bd630b2a2d026d8378ab33aab12f0e" exitCode=0 Dec 04 09:53:43 crc kubenswrapper[4776]: I1204 09:53:43.571881 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96t5h" 
event={"ID":"af9787f6-72ee-4cca-965f-422a88e62ad0","Type":"ContainerDied","Data":"691b8c1963269f94434f7c0741c8521696bd630b2a2d026d8378ab33aab12f0e"} Dec 04 09:53:44 crc kubenswrapper[4776]: I1204 09:53:44.582972 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96t5h" event={"ID":"af9787f6-72ee-4cca-965f-422a88e62ad0","Type":"ContainerStarted","Data":"fe13ef5ae2cfa900bd7f46947c25957c6f4277ffe69f5fbcecec264150117fbc"} Dec 04 09:53:45 crc kubenswrapper[4776]: I1204 09:53:45.692358 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-96t5h" podStartSLOduration=2.547056825 podStartE2EDuration="5.692338419s" podCreationTimestamp="2025-12-04 09:53:40 +0000 UTC" firstStartedPulling="2025-12-04 09:53:41.131142164 +0000 UTC m=+865.997622541" lastFinishedPulling="2025-12-04 09:53:44.276423758 +0000 UTC m=+869.142904135" observedRunningTime="2025-12-04 09:53:45.689638174 +0000 UTC m=+870.556118551" watchObservedRunningTime="2025-12-04 09:53:45.692338419 +0000 UTC m=+870.558818796" Dec 04 09:53:46 crc kubenswrapper[4776]: I1204 09:53:46.347852 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ns5br" Dec 04 09:53:49 crc kubenswrapper[4776]: I1204 09:53:49.380056 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:53:49 crc kubenswrapper[4776]: I1204 09:53:49.380420 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 04 09:53:50 crc kubenswrapper[4776]: I1204 09:53:50.396380 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:50 crc kubenswrapper[4776]: I1204 09:53:50.397291 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:50 crc kubenswrapper[4776]: I1204 09:53:50.442521 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:50 crc kubenswrapper[4776]: I1204 09:53:50.664495 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:50 crc kubenswrapper[4776]: I1204 09:53:50.710330 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-96t5h"] Dec 04 09:53:52 crc kubenswrapper[4776]: I1204 09:53:52.635310 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-96t5h" podUID="af9787f6-72ee-4cca-965f-422a88e62ad0" containerName="registry-server" containerID="cri-o://fe13ef5ae2cfa900bd7f46947c25957c6f4277ffe69f5fbcecec264150117fbc" gracePeriod=2 Dec 04 09:53:53 crc kubenswrapper[4776]: I1204 09:53:53.647512 4776 generic.go:334] "Generic (PLEG): container finished" podID="af9787f6-72ee-4cca-965f-422a88e62ad0" containerID="fe13ef5ae2cfa900bd7f46947c25957c6f4277ffe69f5fbcecec264150117fbc" exitCode=0 Dec 04 09:53:53 crc kubenswrapper[4776]: I1204 09:53:53.647594 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96t5h" event={"ID":"af9787f6-72ee-4cca-965f-422a88e62ad0","Type":"ContainerDied","Data":"fe13ef5ae2cfa900bd7f46947c25957c6f4277ffe69f5fbcecec264150117fbc"} Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.133059 4776 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.246578 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9787f6-72ee-4cca-965f-422a88e62ad0-catalog-content\") pod \"af9787f6-72ee-4cca-965f-422a88e62ad0\" (UID: \"af9787f6-72ee-4cca-965f-422a88e62ad0\") " Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.246644 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9787f6-72ee-4cca-965f-422a88e62ad0-utilities\") pod \"af9787f6-72ee-4cca-965f-422a88e62ad0\" (UID: \"af9787f6-72ee-4cca-965f-422a88e62ad0\") " Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.246670 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcvsg\" (UniqueName: \"kubernetes.io/projected/af9787f6-72ee-4cca-965f-422a88e62ad0-kube-api-access-wcvsg\") pod \"af9787f6-72ee-4cca-965f-422a88e62ad0\" (UID: \"af9787f6-72ee-4cca-965f-422a88e62ad0\") " Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.248605 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af9787f6-72ee-4cca-965f-422a88e62ad0-utilities" (OuterVolumeSpecName: "utilities") pod "af9787f6-72ee-4cca-965f-422a88e62ad0" (UID: "af9787f6-72ee-4cca-965f-422a88e62ad0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.254273 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af9787f6-72ee-4cca-965f-422a88e62ad0-kube-api-access-wcvsg" (OuterVolumeSpecName: "kube-api-access-wcvsg") pod "af9787f6-72ee-4cca-965f-422a88e62ad0" (UID: "af9787f6-72ee-4cca-965f-422a88e62ad0"). 
InnerVolumeSpecName "kube-api-access-wcvsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.295141 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af9787f6-72ee-4cca-965f-422a88e62ad0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af9787f6-72ee-4cca-965f-422a88e62ad0" (UID: "af9787f6-72ee-4cca-965f-422a88e62ad0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.348249 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9787f6-72ee-4cca-965f-422a88e62ad0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.348284 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9787f6-72ee-4cca-965f-422a88e62ad0-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.348296 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcvsg\" (UniqueName: \"kubernetes.io/projected/af9787f6-72ee-4cca-965f-422a88e62ad0-kube-api-access-wcvsg\") on node \"crc\" DevicePath \"\"" Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.660855 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96t5h" event={"ID":"af9787f6-72ee-4cca-965f-422a88e62ad0","Type":"ContainerDied","Data":"09dcd268db0d6dad4b9b38bbf4e1062af10eedcd9d2b6c3ae4eee31de954ba39"} Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.660980 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-96t5h" Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.661240 4776 scope.go:117] "RemoveContainer" containerID="fe13ef5ae2cfa900bd7f46947c25957c6f4277ffe69f5fbcecec264150117fbc" Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.725654 4776 scope.go:117] "RemoveContainer" containerID="691b8c1963269f94434f7c0741c8521696bd630b2a2d026d8378ab33aab12f0e" Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.750013 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-96t5h"] Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.755174 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-96t5h"] Dec 04 09:53:54 crc kubenswrapper[4776]: I1204 09:53:54.758623 4776 scope.go:117] "RemoveContainer" containerID="490e07f0e99e2990bfaf744b79e46f4e81a4c6b964cd3f86249349d3c4b339ab" Dec 04 09:53:55 crc kubenswrapper[4776]: I1204 09:53:55.459576 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af9787f6-72ee-4cca-965f-422a88e62ad0" path="/var/lib/kubelet/pods/af9787f6-72ee-4cca-965f-422a88e62ad0/volumes" Dec 04 09:53:59 crc kubenswrapper[4776]: I1204 09:53:59.933718 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt"] Dec 04 09:53:59 crc kubenswrapper[4776]: E1204 09:53:59.934455 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9787f6-72ee-4cca-965f-422a88e62ad0" containerName="extract-content" Dec 04 09:53:59 crc kubenswrapper[4776]: I1204 09:53:59.934469 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9787f6-72ee-4cca-965f-422a88e62ad0" containerName="extract-content" Dec 04 09:53:59 crc kubenswrapper[4776]: E1204 09:53:59.934481 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9787f6-72ee-4cca-965f-422a88e62ad0" 
containerName="extract-utilities" Dec 04 09:53:59 crc kubenswrapper[4776]: I1204 09:53:59.934487 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9787f6-72ee-4cca-965f-422a88e62ad0" containerName="extract-utilities" Dec 04 09:53:59 crc kubenswrapper[4776]: E1204 09:53:59.934498 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9787f6-72ee-4cca-965f-422a88e62ad0" containerName="registry-server" Dec 04 09:53:59 crc kubenswrapper[4776]: I1204 09:53:59.934505 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9787f6-72ee-4cca-965f-422a88e62ad0" containerName="registry-server" Dec 04 09:53:59 crc kubenswrapper[4776]: I1204 09:53:59.934598 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="af9787f6-72ee-4cca-965f-422a88e62ad0" containerName="registry-server" Dec 04 09:53:59 crc kubenswrapper[4776]: I1204 09:53:59.935370 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" Dec 04 09:53:59 crc kubenswrapper[4776]: I1204 09:53:59.937779 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 09:53:59 crc kubenswrapper[4776]: I1204 09:53:59.944830 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt"] Dec 04 09:54:00 crc kubenswrapper[4776]: I1204 09:54:00.040700 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99270406-e115-46c3-aecb-7155ec24ab04-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt\" (UID: \"99270406-e115-46c3-aecb-7155ec24ab04\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" Dec 04 09:54:00 crc kubenswrapper[4776]: I1204 09:54:00.040774 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99270406-e115-46c3-aecb-7155ec24ab04-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt\" (UID: \"99270406-e115-46c3-aecb-7155ec24ab04\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" Dec 04 09:54:00 crc kubenswrapper[4776]: I1204 09:54:00.040828 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68wz8\" (UniqueName: \"kubernetes.io/projected/99270406-e115-46c3-aecb-7155ec24ab04-kube-api-access-68wz8\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt\" (UID: \"99270406-e115-46c3-aecb-7155ec24ab04\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" Dec 04 09:54:00 crc kubenswrapper[4776]: I1204 09:54:00.142507 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99270406-e115-46c3-aecb-7155ec24ab04-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt\" (UID: \"99270406-e115-46c3-aecb-7155ec24ab04\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" Dec 04 09:54:00 crc kubenswrapper[4776]: I1204 09:54:00.142669 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68wz8\" (UniqueName: \"kubernetes.io/projected/99270406-e115-46c3-aecb-7155ec24ab04-kube-api-access-68wz8\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt\" (UID: \"99270406-e115-46c3-aecb-7155ec24ab04\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" Dec 04 09:54:00 crc kubenswrapper[4776]: I1204 09:54:00.142713 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/99270406-e115-46c3-aecb-7155ec24ab04-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt\" (UID: \"99270406-e115-46c3-aecb-7155ec24ab04\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" Dec 04 09:54:00 crc kubenswrapper[4776]: I1204 09:54:00.143571 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99270406-e115-46c3-aecb-7155ec24ab04-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt\" (UID: \"99270406-e115-46c3-aecb-7155ec24ab04\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" Dec 04 09:54:00 crc kubenswrapper[4776]: I1204 09:54:00.144120 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99270406-e115-46c3-aecb-7155ec24ab04-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt\" (UID: \"99270406-e115-46c3-aecb-7155ec24ab04\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" Dec 04 09:54:00 crc kubenswrapper[4776]: I1204 09:54:00.167933 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68wz8\" (UniqueName: \"kubernetes.io/projected/99270406-e115-46c3-aecb-7155ec24ab04-kube-api-access-68wz8\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt\" (UID: \"99270406-e115-46c3-aecb-7155ec24ab04\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" Dec 04 09:54:00 crc kubenswrapper[4776]: I1204 09:54:00.257270 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" Dec 04 09:54:00 crc kubenswrapper[4776]: I1204 09:54:00.490111 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt"] Dec 04 09:54:00 crc kubenswrapper[4776]: I1204 09:54:00.701297 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" event={"ID":"99270406-e115-46c3-aecb-7155ec24ab04","Type":"ContainerStarted","Data":"4a2ced527638d477da4fb2fb69338e013ad99f2e8cda9bf6c3cdbb7c4a66c490"} Dec 04 09:54:00 crc kubenswrapper[4776]: I1204 09:54:00.701842 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" event={"ID":"99270406-e115-46c3-aecb-7155ec24ab04","Type":"ContainerStarted","Data":"75aa0ed0abbed7f1e90fd1cf6d24071f626f4228aa9520ce239403b485c9bb08"} Dec 04 09:54:01 crc kubenswrapper[4776]: I1204 09:54:01.711531 4776 generic.go:334] "Generic (PLEG): container finished" podID="99270406-e115-46c3-aecb-7155ec24ab04" containerID="4a2ced527638d477da4fb2fb69338e013ad99f2e8cda9bf6c3cdbb7c4a66c490" exitCode=0 Dec 04 09:54:01 crc kubenswrapper[4776]: I1204 09:54:01.711592 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" event={"ID":"99270406-e115-46c3-aecb-7155ec24ab04","Type":"ContainerDied","Data":"4a2ced527638d477da4fb2fb69338e013ad99f2e8cda9bf6c3cdbb7c4a66c490"} Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.238073 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-vm645" podUID="52d9a038-9fbd-4306-9e4a-00901ca865dc" containerName="console" 
containerID="cri-o://1cc138c576e942f4c33dc7c0293a87f966834fd88caed58c6c177af8798b04cd" gracePeriod=15 Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.664518 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vm645_52d9a038-9fbd-4306-9e4a-00901ca865dc/console/0.log" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.664604 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.721699 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vm645_52d9a038-9fbd-4306-9e4a-00901ca865dc/console/0.log" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.721754 4776 generic.go:334] "Generic (PLEG): container finished" podID="52d9a038-9fbd-4306-9e4a-00901ca865dc" containerID="1cc138c576e942f4c33dc7c0293a87f966834fd88caed58c6c177af8798b04cd" exitCode=2 Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.721784 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vm645" event={"ID":"52d9a038-9fbd-4306-9e4a-00901ca865dc","Type":"ContainerDied","Data":"1cc138c576e942f4c33dc7c0293a87f966834fd88caed58c6c177af8798b04cd"} Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.721808 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vm645" event={"ID":"52d9a038-9fbd-4306-9e4a-00901ca865dc","Type":"ContainerDied","Data":"1dfb36ebffe27c470511d5b2c9a5a649c52da7bb67564e7c069c5a8cb1b1bba6"} Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.721826 4776 scope.go:117] "RemoveContainer" containerID="1cc138c576e942f4c33dc7c0293a87f966834fd88caed58c6c177af8798b04cd" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.721859 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-vm645" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.752477 4776 scope.go:117] "RemoveContainer" containerID="1cc138c576e942f4c33dc7c0293a87f966834fd88caed58c6c177af8798b04cd" Dec 04 09:54:02 crc kubenswrapper[4776]: E1204 09:54:02.754378 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cc138c576e942f4c33dc7c0293a87f966834fd88caed58c6c177af8798b04cd\": container with ID starting with 1cc138c576e942f4c33dc7c0293a87f966834fd88caed58c6c177af8798b04cd not found: ID does not exist" containerID="1cc138c576e942f4c33dc7c0293a87f966834fd88caed58c6c177af8798b04cd" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.754443 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc138c576e942f4c33dc7c0293a87f966834fd88caed58c6c177af8798b04cd"} err="failed to get container status \"1cc138c576e942f4c33dc7c0293a87f966834fd88caed58c6c177af8798b04cd\": rpc error: code = NotFound desc = could not find container \"1cc138c576e942f4c33dc7c0293a87f966834fd88caed58c6c177af8798b04cd\": container with ID starting with 1cc138c576e942f4c33dc7c0293a87f966834fd88caed58c6c177af8798b04cd not found: ID does not exist" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.781720 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-oauth-serving-cert\") pod \"52d9a038-9fbd-4306-9e4a-00901ca865dc\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.781809 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-service-ca\") pod \"52d9a038-9fbd-4306-9e4a-00901ca865dc\" (UID: 
\"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.781876 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vh65\" (UniqueName: \"kubernetes.io/projected/52d9a038-9fbd-4306-9e4a-00901ca865dc-kube-api-access-5vh65\") pod \"52d9a038-9fbd-4306-9e4a-00901ca865dc\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.781901 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-config\") pod \"52d9a038-9fbd-4306-9e4a-00901ca865dc\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.781970 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-serving-cert\") pod \"52d9a038-9fbd-4306-9e4a-00901ca865dc\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.782008 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-oauth-config\") pod \"52d9a038-9fbd-4306-9e4a-00901ca865dc\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.782042 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-trusted-ca-bundle\") pod \"52d9a038-9fbd-4306-9e4a-00901ca865dc\" (UID: \"52d9a038-9fbd-4306-9e4a-00901ca865dc\") " Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.783058 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "52d9a038-9fbd-4306-9e4a-00901ca865dc" (UID: "52d9a038-9fbd-4306-9e4a-00901ca865dc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.783079 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-config" (OuterVolumeSpecName: "console-config") pod "52d9a038-9fbd-4306-9e4a-00901ca865dc" (UID: "52d9a038-9fbd-4306-9e4a-00901ca865dc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.783502 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "52d9a038-9fbd-4306-9e4a-00901ca865dc" (UID: "52d9a038-9fbd-4306-9e4a-00901ca865dc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.783608 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-service-ca" (OuterVolumeSpecName: "service-ca") pod "52d9a038-9fbd-4306-9e4a-00901ca865dc" (UID: "52d9a038-9fbd-4306-9e4a-00901ca865dc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.789800 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "52d9a038-9fbd-4306-9e4a-00901ca865dc" (UID: "52d9a038-9fbd-4306-9e4a-00901ca865dc"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.790465 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "52d9a038-9fbd-4306-9e4a-00901ca865dc" (UID: "52d9a038-9fbd-4306-9e4a-00901ca865dc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.790485 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d9a038-9fbd-4306-9e4a-00901ca865dc-kube-api-access-5vh65" (OuterVolumeSpecName: "kube-api-access-5vh65") pod "52d9a038-9fbd-4306-9e4a-00901ca865dc" (UID: "52d9a038-9fbd-4306-9e4a-00901ca865dc"). InnerVolumeSpecName "kube-api-access-5vh65". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.883480 4776 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.883527 4776 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.883555 4776 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.883566 4776 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.883578 4776 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.883585 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vh65\" (UniqueName: \"kubernetes.io/projected/52d9a038-9fbd-4306-9e4a-00901ca865dc-kube-api-access-5vh65\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:02 crc kubenswrapper[4776]: I1204 09:54:02.883598 4776 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52d9a038-9fbd-4306-9e4a-00901ca865dc-console-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:03 crc kubenswrapper[4776]: I1204 09:54:03.053890 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vm645"] Dec 04 09:54:03 crc kubenswrapper[4776]: I1204 09:54:03.058503 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-vm645"] Dec 04 09:54:03 crc kubenswrapper[4776]: I1204 09:54:03.460644 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52d9a038-9fbd-4306-9e4a-00901ca865dc" path="/var/lib/kubelet/pods/52d9a038-9fbd-4306-9e4a-00901ca865dc/volumes" Dec 04 09:54:03 crc kubenswrapper[4776]: I1204 09:54:03.735558 4776 generic.go:334] "Generic (PLEG): container finished" podID="99270406-e115-46c3-aecb-7155ec24ab04" containerID="c50ed2e62bd44ad66182ad26c1827ca938e0a282a2fffb37a40fd699ca9f735e" exitCode=0 Dec 04 09:54:03 crc kubenswrapper[4776]: I1204 09:54:03.735620 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" 
event={"ID":"99270406-e115-46c3-aecb-7155ec24ab04","Type":"ContainerDied","Data":"c50ed2e62bd44ad66182ad26c1827ca938e0a282a2fffb37a40fd699ca9f735e"} Dec 04 09:54:04 crc kubenswrapper[4776]: I1204 09:54:04.746390 4776 generic.go:334] "Generic (PLEG): container finished" podID="99270406-e115-46c3-aecb-7155ec24ab04" containerID="aad41a1b14bde031e195dad0d4d9a798d163dbc8d68c24596abddf7c58d62452" exitCode=0 Dec 04 09:54:04 crc kubenswrapper[4776]: I1204 09:54:04.746572 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" event={"ID":"99270406-e115-46c3-aecb-7155ec24ab04","Type":"ContainerDied","Data":"aad41a1b14bde031e195dad0d4d9a798d163dbc8d68c24596abddf7c58d62452"} Dec 04 09:54:05 crc kubenswrapper[4776]: I1204 09:54:05.988783 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" Dec 04 09:54:06 crc kubenswrapper[4776]: I1204 09:54:06.036112 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99270406-e115-46c3-aecb-7155ec24ab04-util\") pod \"99270406-e115-46c3-aecb-7155ec24ab04\" (UID: \"99270406-e115-46c3-aecb-7155ec24ab04\") " Dec 04 09:54:06 crc kubenswrapper[4776]: I1204 09:54:06.037187 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68wz8\" (UniqueName: \"kubernetes.io/projected/99270406-e115-46c3-aecb-7155ec24ab04-kube-api-access-68wz8\") pod \"99270406-e115-46c3-aecb-7155ec24ab04\" (UID: \"99270406-e115-46c3-aecb-7155ec24ab04\") " Dec 04 09:54:06 crc kubenswrapper[4776]: I1204 09:54:06.037392 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99270406-e115-46c3-aecb-7155ec24ab04-bundle\") pod \"99270406-e115-46c3-aecb-7155ec24ab04\" (UID: 
\"99270406-e115-46c3-aecb-7155ec24ab04\") " Dec 04 09:54:06 crc kubenswrapper[4776]: I1204 09:54:06.039085 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99270406-e115-46c3-aecb-7155ec24ab04-bundle" (OuterVolumeSpecName: "bundle") pod "99270406-e115-46c3-aecb-7155ec24ab04" (UID: "99270406-e115-46c3-aecb-7155ec24ab04"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:54:06 crc kubenswrapper[4776]: I1204 09:54:06.044987 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99270406-e115-46c3-aecb-7155ec24ab04-kube-api-access-68wz8" (OuterVolumeSpecName: "kube-api-access-68wz8") pod "99270406-e115-46c3-aecb-7155ec24ab04" (UID: "99270406-e115-46c3-aecb-7155ec24ab04"). InnerVolumeSpecName "kube-api-access-68wz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:54:06 crc kubenswrapper[4776]: I1204 09:54:06.049669 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99270406-e115-46c3-aecb-7155ec24ab04-util" (OuterVolumeSpecName: "util") pod "99270406-e115-46c3-aecb-7155ec24ab04" (UID: "99270406-e115-46c3-aecb-7155ec24ab04"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:54:06 crc kubenswrapper[4776]: I1204 09:54:06.139239 4776 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99270406-e115-46c3-aecb-7155ec24ab04-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:06 crc kubenswrapper[4776]: I1204 09:54:06.139290 4776 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99270406-e115-46c3-aecb-7155ec24ab04-util\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:06 crc kubenswrapper[4776]: I1204 09:54:06.139302 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68wz8\" (UniqueName: \"kubernetes.io/projected/99270406-e115-46c3-aecb-7155ec24ab04-kube-api-access-68wz8\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:06 crc kubenswrapper[4776]: I1204 09:54:06.779141 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" event={"ID":"99270406-e115-46c3-aecb-7155ec24ab04","Type":"ContainerDied","Data":"75aa0ed0abbed7f1e90fd1cf6d24071f626f4228aa9520ce239403b485c9bb08"} Dec 04 09:54:06 crc kubenswrapper[4776]: I1204 09:54:06.779737 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75aa0ed0abbed7f1e90fd1cf6d24071f626f4228aa9520ce239403b485c9bb08" Dec 04 09:54:06 crc kubenswrapper[4776]: I1204 09:54:06.779244 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.219077 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr"] Dec 04 09:54:16 crc kubenswrapper[4776]: E1204 09:54:16.219737 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99270406-e115-46c3-aecb-7155ec24ab04" containerName="util" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.219748 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="99270406-e115-46c3-aecb-7155ec24ab04" containerName="util" Dec 04 09:54:16 crc kubenswrapper[4776]: E1204 09:54:16.219758 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99270406-e115-46c3-aecb-7155ec24ab04" containerName="extract" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.219764 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="99270406-e115-46c3-aecb-7155ec24ab04" containerName="extract" Dec 04 09:54:16 crc kubenswrapper[4776]: E1204 09:54:16.219772 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d9a038-9fbd-4306-9e4a-00901ca865dc" containerName="console" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.219778 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d9a038-9fbd-4306-9e4a-00901ca865dc" containerName="console" Dec 04 09:54:16 crc kubenswrapper[4776]: E1204 09:54:16.219793 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99270406-e115-46c3-aecb-7155ec24ab04" containerName="pull" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.219799 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="99270406-e115-46c3-aecb-7155ec24ab04" containerName="pull" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.219906 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d9a038-9fbd-4306-9e4a-00901ca865dc" 
containerName="console" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.219937 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="99270406-e115-46c3-aecb-7155ec24ab04" containerName="extract" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.220334 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.224894 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.225078 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.225208 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.229192 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5s4zs" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.229281 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.240443 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr"] Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.272798 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0887beaf-a370-4268-9011-8278551d91bd-apiservice-cert\") pod \"metallb-operator-controller-manager-855d6cf46f-579qr\" (UID: \"0887beaf-a370-4268-9011-8278551d91bd\") " pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" Dec 
04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.272870 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w8v8\" (UniqueName: \"kubernetes.io/projected/0887beaf-a370-4268-9011-8278551d91bd-kube-api-access-4w8v8\") pod \"metallb-operator-controller-manager-855d6cf46f-579qr\" (UID: \"0887beaf-a370-4268-9011-8278551d91bd\") " pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.273130 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0887beaf-a370-4268-9011-8278551d91bd-webhook-cert\") pod \"metallb-operator-controller-manager-855d6cf46f-579qr\" (UID: \"0887beaf-a370-4268-9011-8278551d91bd\") " pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.374812 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0887beaf-a370-4268-9011-8278551d91bd-apiservice-cert\") pod \"metallb-operator-controller-manager-855d6cf46f-579qr\" (UID: \"0887beaf-a370-4268-9011-8278551d91bd\") " pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.374885 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w8v8\" (UniqueName: \"kubernetes.io/projected/0887beaf-a370-4268-9011-8278551d91bd-kube-api-access-4w8v8\") pod \"metallb-operator-controller-manager-855d6cf46f-579qr\" (UID: \"0887beaf-a370-4268-9011-8278551d91bd\") " pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.374983 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/0887beaf-a370-4268-9011-8278551d91bd-webhook-cert\") pod \"metallb-operator-controller-manager-855d6cf46f-579qr\" (UID: \"0887beaf-a370-4268-9011-8278551d91bd\") " pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.383043 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0887beaf-a370-4268-9011-8278551d91bd-apiservice-cert\") pod \"metallb-operator-controller-manager-855d6cf46f-579qr\" (UID: \"0887beaf-a370-4268-9011-8278551d91bd\") " pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.383649 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0887beaf-a370-4268-9011-8278551d91bd-webhook-cert\") pod \"metallb-operator-controller-manager-855d6cf46f-579qr\" (UID: \"0887beaf-a370-4268-9011-8278551d91bd\") " pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.394503 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w8v8\" (UniqueName: \"kubernetes.io/projected/0887beaf-a370-4268-9011-8278551d91bd-kube-api-access-4w8v8\") pod \"metallb-operator-controller-manager-855d6cf46f-579qr\" (UID: \"0887beaf-a370-4268-9011-8278551d91bd\") " pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.535496 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.539256 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n"] Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.539980 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.545268 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.545375 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.545282 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-ml4qx" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.551293 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n"] Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.679327 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhqz5\" (UniqueName: \"kubernetes.io/projected/82bf45e3-e222-4569-bedd-5c160fa3f1d4-kube-api-access-rhqz5\") pod \"metallb-operator-webhook-server-769c6857f6-zvn5n\" (UID: \"82bf45e3-e222-4569-bedd-5c160fa3f1d4\") " pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.679458 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82bf45e3-e222-4569-bedd-5c160fa3f1d4-apiservice-cert\") pod 
\"metallb-operator-webhook-server-769c6857f6-zvn5n\" (UID: \"82bf45e3-e222-4569-bedd-5c160fa3f1d4\") " pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.679496 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82bf45e3-e222-4569-bedd-5c160fa3f1d4-webhook-cert\") pod \"metallb-operator-webhook-server-769c6857f6-zvn5n\" (UID: \"82bf45e3-e222-4569-bedd-5c160fa3f1d4\") " pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.781308 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhqz5\" (UniqueName: \"kubernetes.io/projected/82bf45e3-e222-4569-bedd-5c160fa3f1d4-kube-api-access-rhqz5\") pod \"metallb-operator-webhook-server-769c6857f6-zvn5n\" (UID: \"82bf45e3-e222-4569-bedd-5c160fa3f1d4\") " pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.782175 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82bf45e3-e222-4569-bedd-5c160fa3f1d4-apiservice-cert\") pod \"metallb-operator-webhook-server-769c6857f6-zvn5n\" (UID: \"82bf45e3-e222-4569-bedd-5c160fa3f1d4\") " pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.782327 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82bf45e3-e222-4569-bedd-5c160fa3f1d4-webhook-cert\") pod \"metallb-operator-webhook-server-769c6857f6-zvn5n\" (UID: \"82bf45e3-e222-4569-bedd-5c160fa3f1d4\") " pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.790259 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82bf45e3-e222-4569-bedd-5c160fa3f1d4-webhook-cert\") pod \"metallb-operator-webhook-server-769c6857f6-zvn5n\" (UID: \"82bf45e3-e222-4569-bedd-5c160fa3f1d4\") " pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.804440 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhqz5\" (UniqueName: \"kubernetes.io/projected/82bf45e3-e222-4569-bedd-5c160fa3f1d4-kube-api-access-rhqz5\") pod \"metallb-operator-webhook-server-769c6857f6-zvn5n\" (UID: \"82bf45e3-e222-4569-bedd-5c160fa3f1d4\") " pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.811416 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82bf45e3-e222-4569-bedd-5c160fa3f1d4-apiservice-cert\") pod \"metallb-operator-webhook-server-769c6857f6-zvn5n\" (UID: \"82bf45e3-e222-4569-bedd-5c160fa3f1d4\") " pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.906354 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr"] Dec 04 09:54:16 crc kubenswrapper[4776]: I1204 09:54:16.914585 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" Dec 04 09:54:17 crc kubenswrapper[4776]: I1204 09:54:17.127351 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n"] Dec 04 09:54:17 crc kubenswrapper[4776]: W1204 09:54:17.133447 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82bf45e3_e222_4569_bedd_5c160fa3f1d4.slice/crio-7034f667dbbecad94b31cbd1c2058e83da1d691a8ffe16126af5ddf9527cfd23 WatchSource:0}: Error finding container 7034f667dbbecad94b31cbd1c2058e83da1d691a8ffe16126af5ddf9527cfd23: Status 404 returned error can't find the container with id 7034f667dbbecad94b31cbd1c2058e83da1d691a8ffe16126af5ddf9527cfd23 Dec 04 09:54:17 crc kubenswrapper[4776]: I1204 09:54:17.847334 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" event={"ID":"82bf45e3-e222-4569-bedd-5c160fa3f1d4","Type":"ContainerStarted","Data":"7034f667dbbecad94b31cbd1c2058e83da1d691a8ffe16126af5ddf9527cfd23"} Dec 04 09:54:17 crc kubenswrapper[4776]: I1204 09:54:17.849121 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" event={"ID":"0887beaf-a370-4268-9011-8278551d91bd","Type":"ContainerStarted","Data":"c6ba2992e6fbbfad02b0ab3b47624f35cc8073887b74a92f838b8a7f6ec85c67"} Dec 04 09:54:18 crc kubenswrapper[4776]: I1204 09:54:18.387278 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cpw6z"] Dec 04 09:54:18 crc kubenswrapper[4776]: I1204 09:54:18.388998 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:18 crc kubenswrapper[4776]: I1204 09:54:18.412869 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpw6z"] Dec 04 09:54:18 crc kubenswrapper[4776]: I1204 09:54:18.541978 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-catalog-content\") pod \"redhat-marketplace-cpw6z\" (UID: \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\") " pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:18 crc kubenswrapper[4776]: I1204 09:54:18.542046 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx994\" (UniqueName: \"kubernetes.io/projected/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-kube-api-access-lx994\") pod \"redhat-marketplace-cpw6z\" (UID: \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\") " pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:18 crc kubenswrapper[4776]: I1204 09:54:18.542082 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-utilities\") pod \"redhat-marketplace-cpw6z\" (UID: \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\") " pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:18 crc kubenswrapper[4776]: I1204 09:54:18.644372 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-catalog-content\") pod \"redhat-marketplace-cpw6z\" (UID: \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\") " pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:18 crc kubenswrapper[4776]: I1204 09:54:18.644434 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lx994\" (UniqueName: \"kubernetes.io/projected/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-kube-api-access-lx994\") pod \"redhat-marketplace-cpw6z\" (UID: \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\") " pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:18 crc kubenswrapper[4776]: I1204 09:54:18.644458 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-utilities\") pod \"redhat-marketplace-cpw6z\" (UID: \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\") " pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:18 crc kubenswrapper[4776]: I1204 09:54:18.644964 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-catalog-content\") pod \"redhat-marketplace-cpw6z\" (UID: \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\") " pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:18 crc kubenswrapper[4776]: I1204 09:54:18.645122 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-utilities\") pod \"redhat-marketplace-cpw6z\" (UID: \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\") " pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:18 crc kubenswrapper[4776]: I1204 09:54:18.684823 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx994\" (UniqueName: \"kubernetes.io/projected/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-kube-api-access-lx994\") pod \"redhat-marketplace-cpw6z\" (UID: \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\") " pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:18 crc kubenswrapper[4776]: I1204 09:54:18.706467 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:19 crc kubenswrapper[4776]: I1204 09:54:19.002623 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpw6z"] Dec 04 09:54:19 crc kubenswrapper[4776]: I1204 09:54:19.380383 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:54:19 crc kubenswrapper[4776]: I1204 09:54:19.380472 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:54:19 crc kubenswrapper[4776]: I1204 09:54:19.868817 4776 generic.go:334] "Generic (PLEG): container finished" podID="49582dc6-f2b9-4c34-8dff-a86c4a5f3079" containerID="a0771f77ec46b34e53d300ef5f9e1b07460a140c59b384ca5b964d81f02f44c2" exitCode=0 Dec 04 09:54:19 crc kubenswrapper[4776]: I1204 09:54:19.868904 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpw6z" event={"ID":"49582dc6-f2b9-4c34-8dff-a86c4a5f3079","Type":"ContainerDied","Data":"a0771f77ec46b34e53d300ef5f9e1b07460a140c59b384ca5b964d81f02f44c2"} Dec 04 09:54:19 crc kubenswrapper[4776]: I1204 09:54:19.868948 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpw6z" event={"ID":"49582dc6-f2b9-4c34-8dff-a86c4a5f3079","Type":"ContainerStarted","Data":"cb02a0f56e66b4b9ba68d98b8c2a8a0822b28b07b6877105a4507df8a05875d4"} Dec 04 09:54:25 crc kubenswrapper[4776]: I1204 09:54:25.934759 4776 generic.go:334] "Generic (PLEG): 
container finished" podID="49582dc6-f2b9-4c34-8dff-a86c4a5f3079" containerID="a0c8369686e4463cda0a95783d1f3b0f60fe41f50ee8024ea9a48d1e3ea2d990" exitCode=0 Dec 04 09:54:25 crc kubenswrapper[4776]: I1204 09:54:25.934896 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpw6z" event={"ID":"49582dc6-f2b9-4c34-8dff-a86c4a5f3079","Type":"ContainerDied","Data":"a0c8369686e4463cda0a95783d1f3b0f60fe41f50ee8024ea9a48d1e3ea2d990"} Dec 04 09:54:25 crc kubenswrapper[4776]: I1204 09:54:25.944883 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" event={"ID":"82bf45e3-e222-4569-bedd-5c160fa3f1d4","Type":"ContainerStarted","Data":"a33f150d44131a5621ca06fc63b5f3ada0b0eaaf281c7385fd3816c870913877"} Dec 04 09:54:25 crc kubenswrapper[4776]: I1204 09:54:25.945138 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" Dec 04 09:54:25 crc kubenswrapper[4776]: I1204 09:54:25.947159 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" event={"ID":"0887beaf-a370-4268-9011-8278551d91bd","Type":"ContainerStarted","Data":"2ef8ec49099887fc922532408972b711f079e52f13275d3a78c7c7461af0bf08"} Dec 04 09:54:25 crc kubenswrapper[4776]: I1204 09:54:25.948109 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" Dec 04 09:54:25 crc kubenswrapper[4776]: I1204 09:54:25.983285 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" podStartSLOduration=1.8128878689999999 podStartE2EDuration="9.983269518s" podCreationTimestamp="2025-12-04 09:54:16 +0000 UTC" firstStartedPulling="2025-12-04 09:54:17.1365181 +0000 UTC m=+902.002998477" 
lastFinishedPulling="2025-12-04 09:54:25.306899739 +0000 UTC m=+910.173380126" observedRunningTime="2025-12-04 09:54:25.983155834 +0000 UTC m=+910.849636211" watchObservedRunningTime="2025-12-04 09:54:25.983269518 +0000 UTC m=+910.849749895" Dec 04 09:54:26 crc kubenswrapper[4776]: I1204 09:54:26.004448 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" podStartSLOduration=1.621104857 podStartE2EDuration="10.004428311s" podCreationTimestamp="2025-12-04 09:54:16 +0000 UTC" firstStartedPulling="2025-12-04 09:54:16.908069642 +0000 UTC m=+901.774550019" lastFinishedPulling="2025-12-04 09:54:25.291393096 +0000 UTC m=+910.157873473" observedRunningTime="2025-12-04 09:54:26.001958332 +0000 UTC m=+910.868438709" watchObservedRunningTime="2025-12-04 09:54:26.004428311 +0000 UTC m=+910.870908688" Dec 04 09:54:26 crc kubenswrapper[4776]: I1204 09:54:26.956726 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpw6z" event={"ID":"49582dc6-f2b9-4c34-8dff-a86c4a5f3079","Type":"ContainerStarted","Data":"dc3652a9202becc18a22b92925d3eab2675839de50b3628b1ddce618e3255ae7"} Dec 04 09:54:26 crc kubenswrapper[4776]: I1204 09:54:26.977842 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cpw6z" podStartSLOduration=2.463663349 podStartE2EDuration="8.977815868s" podCreationTimestamp="2025-12-04 09:54:18 +0000 UTC" firstStartedPulling="2025-12-04 09:54:19.875659935 +0000 UTC m=+904.742140312" lastFinishedPulling="2025-12-04 09:54:26.389812464 +0000 UTC m=+911.256292831" observedRunningTime="2025-12-04 09:54:26.976254978 +0000 UTC m=+911.842735355" watchObservedRunningTime="2025-12-04 09:54:26.977815868 +0000 UTC m=+911.844296245" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.450352 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-48l2s"] Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.452266 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.464699 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-48l2s"] Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.592284 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca04854-33fe-40ac-a3fc-612b8a96fffb-catalog-content\") pod \"certified-operators-48l2s\" (UID: \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\") " pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.592610 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca04854-33fe-40ac-a3fc-612b8a96fffb-utilities\") pod \"certified-operators-48l2s\" (UID: \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\") " pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.592674 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mkrz\" (UniqueName: \"kubernetes.io/projected/1ca04854-33fe-40ac-a3fc-612b8a96fffb-kube-api-access-8mkrz\") pod \"certified-operators-48l2s\" (UID: \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\") " pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.694281 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca04854-33fe-40ac-a3fc-612b8a96fffb-utilities\") pod \"certified-operators-48l2s\" (UID: \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\") 
" pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.694707 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mkrz\" (UniqueName: \"kubernetes.io/projected/1ca04854-33fe-40ac-a3fc-612b8a96fffb-kube-api-access-8mkrz\") pod \"certified-operators-48l2s\" (UID: \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\") " pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.694898 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca04854-33fe-40ac-a3fc-612b8a96fffb-catalog-content\") pod \"certified-operators-48l2s\" (UID: \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\") " pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.695064 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca04854-33fe-40ac-a3fc-612b8a96fffb-utilities\") pod \"certified-operators-48l2s\" (UID: \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\") " pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.695570 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca04854-33fe-40ac-a3fc-612b8a96fffb-catalog-content\") pod \"certified-operators-48l2s\" (UID: \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\") " pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.707020 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.707112 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.721160 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mkrz\" (UniqueName: \"kubernetes.io/projected/1ca04854-33fe-40ac-a3fc-612b8a96fffb-kube-api-access-8mkrz\") pod \"certified-operators-48l2s\" (UID: \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\") " pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.777114 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:28 crc kubenswrapper[4776]: I1204 09:54:28.792587 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:29 crc kubenswrapper[4776]: I1204 09:54:29.339270 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-48l2s"] Dec 04 09:54:29 crc kubenswrapper[4776]: I1204 09:54:29.976995 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48l2s" event={"ID":"1ca04854-33fe-40ac-a3fc-612b8a96fffb","Type":"ContainerStarted","Data":"97b9cafc0d8e99a51b2262b458331f27ad4a5c5c444efdd0c300198bbb54e999"} Dec 04 09:54:30 crc kubenswrapper[4776]: I1204 09:54:30.983093 4776 generic.go:334] "Generic (PLEG): container finished" podID="1ca04854-33fe-40ac-a3fc-612b8a96fffb" containerID="43ca212f6d444766d65569736d96025fb06a2b84ba70837ea26378cee3926bd6" exitCode=0 Dec 04 09:54:30 crc kubenswrapper[4776]: I1204 09:54:30.983199 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48l2s" event={"ID":"1ca04854-33fe-40ac-a3fc-612b8a96fffb","Type":"ContainerDied","Data":"43ca212f6d444766d65569736d96025fb06a2b84ba70837ea26378cee3926bd6"} Dec 04 09:54:32 crc kubenswrapper[4776]: I1204 09:54:32.998807 4776 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48l2s" event={"ID":"1ca04854-33fe-40ac-a3fc-612b8a96fffb","Type":"ContainerStarted","Data":"35d727ec3e136e18363a259ddd271568726d8ac55394a3c6acc13c41d2ac7064"} Dec 04 09:54:34 crc kubenswrapper[4776]: I1204 09:54:34.008727 4776 generic.go:334] "Generic (PLEG): container finished" podID="1ca04854-33fe-40ac-a3fc-612b8a96fffb" containerID="35d727ec3e136e18363a259ddd271568726d8ac55394a3c6acc13c41d2ac7064" exitCode=0 Dec 04 09:54:34 crc kubenswrapper[4776]: I1204 09:54:34.008785 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48l2s" event={"ID":"1ca04854-33fe-40ac-a3fc-612b8a96fffb","Type":"ContainerDied","Data":"35d727ec3e136e18363a259ddd271568726d8ac55394a3c6acc13c41d2ac7064"} Dec 04 09:54:35 crc kubenswrapper[4776]: I1204 09:54:35.056882 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48l2s" event={"ID":"1ca04854-33fe-40ac-a3fc-612b8a96fffb","Type":"ContainerStarted","Data":"cc0d594ac4bb6dab11a7c7702bbdf19de9e244d8e0217ead38b70e993ae078bc"} Dec 04 09:54:35 crc kubenswrapper[4776]: I1204 09:54:35.237760 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-48l2s" podStartSLOduration=3.741290642 podStartE2EDuration="7.237733603s" podCreationTimestamp="2025-12-04 09:54:28 +0000 UTC" firstStartedPulling="2025-12-04 09:54:30.984550766 +0000 UTC m=+915.851031143" lastFinishedPulling="2025-12-04 09:54:34.480993717 +0000 UTC m=+919.347474104" observedRunningTime="2025-12-04 09:54:35.231307659 +0000 UTC m=+920.097788046" watchObservedRunningTime="2025-12-04 09:54:35.237733603 +0000 UTC m=+920.104213980" Dec 04 09:54:36 crc kubenswrapper[4776]: I1204 09:54:36.926439 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-769c6857f6-zvn5n" Dec 04 09:54:38 crc 
kubenswrapper[4776]: I1204 09:54:38.778203 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:38 crc kubenswrapper[4776]: I1204 09:54:38.778594 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:38 crc kubenswrapper[4776]: I1204 09:54:38.838977 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:38 crc kubenswrapper[4776]: I1204 09:54:38.922826 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:39 crc kubenswrapper[4776]: I1204 09:54:39.125889 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.241022 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-48l2s"] Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.241629 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-48l2s" podUID="1ca04854-33fe-40ac-a3fc-612b8a96fffb" containerName="registry-server" containerID="cri-o://cc0d594ac4bb6dab11a7c7702bbdf19de9e244d8e0217ead38b70e993ae078bc" gracePeriod=2 Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.451679 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpw6z"] Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.452219 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cpw6z" podUID="49582dc6-f2b9-4c34-8dff-a86c4a5f3079" containerName="registry-server" 
containerID="cri-o://dc3652a9202becc18a22b92925d3eab2675839de50b3628b1ddce618e3255ae7" gracePeriod=2 Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.601749 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.753037 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca04854-33fe-40ac-a3fc-612b8a96fffb-catalog-content\") pod \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\" (UID: \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\") " Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.753090 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mkrz\" (UniqueName: \"kubernetes.io/projected/1ca04854-33fe-40ac-a3fc-612b8a96fffb-kube-api-access-8mkrz\") pod \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\" (UID: \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\") " Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.753117 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca04854-33fe-40ac-a3fc-612b8a96fffb-utilities\") pod \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\" (UID: \"1ca04854-33fe-40ac-a3fc-612b8a96fffb\") " Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.754454 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca04854-33fe-40ac-a3fc-612b8a96fffb-utilities" (OuterVolumeSpecName: "utilities") pod "1ca04854-33fe-40ac-a3fc-612b8a96fffb" (UID: "1ca04854-33fe-40ac-a3fc-612b8a96fffb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.836121 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca04854-33fe-40ac-a3fc-612b8a96fffb-kube-api-access-8mkrz" (OuterVolumeSpecName: "kube-api-access-8mkrz") pod "1ca04854-33fe-40ac-a3fc-612b8a96fffb" (UID: "1ca04854-33fe-40ac-a3fc-612b8a96fffb"). InnerVolumeSpecName "kube-api-access-8mkrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.856051 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mkrz\" (UniqueName: \"kubernetes.io/projected/1ca04854-33fe-40ac-a3fc-612b8a96fffb-kube-api-access-8mkrz\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.856098 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca04854-33fe-40ac-a3fc-612b8a96fffb-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.882065 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca04854-33fe-40ac-a3fc-612b8a96fffb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ca04854-33fe-40ac-a3fc-612b8a96fffb" (UID: "1ca04854-33fe-40ac-a3fc-612b8a96fffb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:54:42 crc kubenswrapper[4776]: I1204 09:54:42.957722 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca04854-33fe-40ac-a3fc-612b8a96fffb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.105384 4776 generic.go:334] "Generic (PLEG): container finished" podID="1ca04854-33fe-40ac-a3fc-612b8a96fffb" containerID="cc0d594ac4bb6dab11a7c7702bbdf19de9e244d8e0217ead38b70e993ae078bc" exitCode=0 Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.105439 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48l2s" event={"ID":"1ca04854-33fe-40ac-a3fc-612b8a96fffb","Type":"ContainerDied","Data":"cc0d594ac4bb6dab11a7c7702bbdf19de9e244d8e0217ead38b70e993ae078bc"} Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.105497 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48l2s" event={"ID":"1ca04854-33fe-40ac-a3fc-612b8a96fffb","Type":"ContainerDied","Data":"97b9cafc0d8e99a51b2262b458331f27ad4a5c5c444efdd0c300198bbb54e999"} Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.105516 4776 scope.go:117] "RemoveContainer" containerID="cc0d594ac4bb6dab11a7c7702bbdf19de9e244d8e0217ead38b70e993ae078bc" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.105509 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-48l2s" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.119822 4776 generic.go:334] "Generic (PLEG): container finished" podID="49582dc6-f2b9-4c34-8dff-a86c4a5f3079" containerID="dc3652a9202becc18a22b92925d3eab2675839de50b3628b1ddce618e3255ae7" exitCode=0 Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.119887 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpw6z" event={"ID":"49582dc6-f2b9-4c34-8dff-a86c4a5f3079","Type":"ContainerDied","Data":"dc3652a9202becc18a22b92925d3eab2675839de50b3628b1ddce618e3255ae7"} Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.119989 4776 scope.go:117] "RemoveContainer" containerID="35d727ec3e136e18363a259ddd271568726d8ac55394a3c6acc13c41d2ac7064" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.154606 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-48l2s"] Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.154663 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-48l2s"] Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.306986 4776 scope.go:117] "RemoveContainer" containerID="43ca212f6d444766d65569736d96025fb06a2b84ba70837ea26378cee3926bd6" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.317714 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.339892 4776 scope.go:117] "RemoveContainer" containerID="cc0d594ac4bb6dab11a7c7702bbdf19de9e244d8e0217ead38b70e993ae078bc" Dec 04 09:54:43 crc kubenswrapper[4776]: E1204 09:54:43.340270 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc0d594ac4bb6dab11a7c7702bbdf19de9e244d8e0217ead38b70e993ae078bc\": container with ID starting with cc0d594ac4bb6dab11a7c7702bbdf19de9e244d8e0217ead38b70e993ae078bc not found: ID does not exist" containerID="cc0d594ac4bb6dab11a7c7702bbdf19de9e244d8e0217ead38b70e993ae078bc" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.340302 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc0d594ac4bb6dab11a7c7702bbdf19de9e244d8e0217ead38b70e993ae078bc"} err="failed to get container status \"cc0d594ac4bb6dab11a7c7702bbdf19de9e244d8e0217ead38b70e993ae078bc\": rpc error: code = NotFound desc = could not find container \"cc0d594ac4bb6dab11a7c7702bbdf19de9e244d8e0217ead38b70e993ae078bc\": container with ID starting with cc0d594ac4bb6dab11a7c7702bbdf19de9e244d8e0217ead38b70e993ae078bc not found: ID does not exist" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.340326 4776 scope.go:117] "RemoveContainer" containerID="35d727ec3e136e18363a259ddd271568726d8ac55394a3c6acc13c41d2ac7064" Dec 04 09:54:43 crc kubenswrapper[4776]: E1204 09:54:43.340755 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35d727ec3e136e18363a259ddd271568726d8ac55394a3c6acc13c41d2ac7064\": container with ID starting with 35d727ec3e136e18363a259ddd271568726d8ac55394a3c6acc13c41d2ac7064 not found: ID does not exist" containerID="35d727ec3e136e18363a259ddd271568726d8ac55394a3c6acc13c41d2ac7064" Dec 04 09:54:43 crc kubenswrapper[4776]: 
I1204 09:54:43.340777 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d727ec3e136e18363a259ddd271568726d8ac55394a3c6acc13c41d2ac7064"} err="failed to get container status \"35d727ec3e136e18363a259ddd271568726d8ac55394a3c6acc13c41d2ac7064\": rpc error: code = NotFound desc = could not find container \"35d727ec3e136e18363a259ddd271568726d8ac55394a3c6acc13c41d2ac7064\": container with ID starting with 35d727ec3e136e18363a259ddd271568726d8ac55394a3c6acc13c41d2ac7064 not found: ID does not exist" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.340791 4776 scope.go:117] "RemoveContainer" containerID="43ca212f6d444766d65569736d96025fb06a2b84ba70837ea26378cee3926bd6" Dec 04 09:54:43 crc kubenswrapper[4776]: E1204 09:54:43.341154 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ca212f6d444766d65569736d96025fb06a2b84ba70837ea26378cee3926bd6\": container with ID starting with 43ca212f6d444766d65569736d96025fb06a2b84ba70837ea26378cee3926bd6 not found: ID does not exist" containerID="43ca212f6d444766d65569736d96025fb06a2b84ba70837ea26378cee3926bd6" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.341192 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ca212f6d444766d65569736d96025fb06a2b84ba70837ea26378cee3926bd6"} err="failed to get container status \"43ca212f6d444766d65569736d96025fb06a2b84ba70837ea26378cee3926bd6\": rpc error: code = NotFound desc = could not find container \"43ca212f6d444766d65569736d96025fb06a2b84ba70837ea26378cee3926bd6\": container with ID starting with 43ca212f6d444766d65569736d96025fb06a2b84ba70837ea26378cee3926bd6 not found: ID does not exist" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.459893 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca04854-33fe-40ac-a3fc-612b8a96fffb" 
path="/var/lib/kubelet/pods/1ca04854-33fe-40ac-a3fc-612b8a96fffb/volumes" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.463986 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-utilities\") pod \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\" (UID: \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\") " Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.464075 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-catalog-content\") pod \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\" (UID: \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\") " Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.464123 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx994\" (UniqueName: \"kubernetes.io/projected/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-kube-api-access-lx994\") pod \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\" (UID: \"49582dc6-f2b9-4c34-8dff-a86c4a5f3079\") " Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.464694 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-utilities" (OuterVolumeSpecName: "utilities") pod "49582dc6-f2b9-4c34-8dff-a86c4a5f3079" (UID: "49582dc6-f2b9-4c34-8dff-a86c4a5f3079"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.467083 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-kube-api-access-lx994" (OuterVolumeSpecName: "kube-api-access-lx994") pod "49582dc6-f2b9-4c34-8dff-a86c4a5f3079" (UID: "49582dc6-f2b9-4c34-8dff-a86c4a5f3079"). InnerVolumeSpecName "kube-api-access-lx994". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.484906 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49582dc6-f2b9-4c34-8dff-a86c4a5f3079" (UID: "49582dc6-f2b9-4c34-8dff-a86c4a5f3079"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.565427 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.565462 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:43 crc kubenswrapper[4776]: I1204 09:54:43.565479 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx994\" (UniqueName: \"kubernetes.io/projected/49582dc6-f2b9-4c34-8dff-a86c4a5f3079-kube-api-access-lx994\") on node \"crc\" DevicePath \"\"" Dec 04 09:54:44 crc kubenswrapper[4776]: I1204 09:54:44.128630 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cpw6z" event={"ID":"49582dc6-f2b9-4c34-8dff-a86c4a5f3079","Type":"ContainerDied","Data":"cb02a0f56e66b4b9ba68d98b8c2a8a0822b28b07b6877105a4507df8a05875d4"} Dec 04 09:54:44 crc kubenswrapper[4776]: I1204 09:54:44.128997 4776 scope.go:117] "RemoveContainer" containerID="dc3652a9202becc18a22b92925d3eab2675839de50b3628b1ddce618e3255ae7" Dec 04 09:54:44 crc kubenswrapper[4776]: I1204 09:54:44.128690 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cpw6z" Dec 04 09:54:44 crc kubenswrapper[4776]: I1204 09:54:44.167670 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpw6z"] Dec 04 09:54:44 crc kubenswrapper[4776]: I1204 09:54:44.174352 4776 scope.go:117] "RemoveContainer" containerID="a0c8369686e4463cda0a95783d1f3b0f60fe41f50ee8024ea9a48d1e3ea2d990" Dec 04 09:54:44 crc kubenswrapper[4776]: I1204 09:54:44.174580 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cpw6z"] Dec 04 09:54:44 crc kubenswrapper[4776]: I1204 09:54:44.212110 4776 scope.go:117] "RemoveContainer" containerID="a0771f77ec46b34e53d300ef5f9e1b07460a140c59b384ca5b964d81f02f44c2" Dec 04 09:54:45 crc kubenswrapper[4776]: I1204 09:54:45.461605 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49582dc6-f2b9-4c34-8dff-a86c4a5f3079" path="/var/lib/kubelet/pods/49582dc6-f2b9-4c34-8dff-a86c4a5f3079/volumes" Dec 04 09:54:49 crc kubenswrapper[4776]: I1204 09:54:49.380403 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:54:49 crc kubenswrapper[4776]: I1204 09:54:49.380981 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:54:49 crc kubenswrapper[4776]: I1204 09:54:49.381085 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 
09:54:49 crc kubenswrapper[4776]: I1204 09:54:49.382048 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7dee382b67ae6de15878aafacfd524a1e7ecdaa0880997ede900fe467e79e6d0"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 09:54:49 crc kubenswrapper[4776]: I1204 09:54:49.382138 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://7dee382b67ae6de15878aafacfd524a1e7ecdaa0880997ede900fe467e79e6d0" gracePeriod=600 Dec 04 09:54:50 crc kubenswrapper[4776]: I1204 09:54:50.163453 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="7dee382b67ae6de15878aafacfd524a1e7ecdaa0880997ede900fe467e79e6d0" exitCode=0 Dec 04 09:54:50 crc kubenswrapper[4776]: I1204 09:54:50.163531 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"7dee382b67ae6de15878aafacfd524a1e7ecdaa0880997ede900fe467e79e6d0"} Dec 04 09:54:50 crc kubenswrapper[4776]: I1204 09:54:50.164108 4776 scope.go:117] "RemoveContainer" containerID="d12dd5f0f4d1f9a752fa1f4be19cc5b8f751ba5253eb00ebb21d382d93196690" Dec 04 09:54:51 crc kubenswrapper[4776]: I1204 09:54:51.172007 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"e01a20d48aa8f7249b057929edbda0928b81534859b7bbd6d1f1ff0ee5da05c8"} Dec 04 09:54:56 crc kubenswrapper[4776]: I1204 09:54:56.538185 4776 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-855d6cf46f-579qr" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.345893 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-xv8wt"] Dec 04 09:54:57 crc kubenswrapper[4776]: E1204 09:54:57.346238 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca04854-33fe-40ac-a3fc-612b8a96fffb" containerName="registry-server" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.346261 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca04854-33fe-40ac-a3fc-612b8a96fffb" containerName="registry-server" Dec 04 09:54:57 crc kubenswrapper[4776]: E1204 09:54:57.346274 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49582dc6-f2b9-4c34-8dff-a86c4a5f3079" containerName="extract-utilities" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.346287 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="49582dc6-f2b9-4c34-8dff-a86c4a5f3079" containerName="extract-utilities" Dec 04 09:54:57 crc kubenswrapper[4776]: E1204 09:54:57.346310 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca04854-33fe-40ac-a3fc-612b8a96fffb" containerName="extract-content" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.346319 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca04854-33fe-40ac-a3fc-612b8a96fffb" containerName="extract-content" Dec 04 09:54:57 crc kubenswrapper[4776]: E1204 09:54:57.346330 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49582dc6-f2b9-4c34-8dff-a86c4a5f3079" containerName="extract-content" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.346339 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="49582dc6-f2b9-4c34-8dff-a86c4a5f3079" containerName="extract-content" Dec 04 09:54:57 crc kubenswrapper[4776]: E1204 09:54:57.346348 4776 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="49582dc6-f2b9-4c34-8dff-a86c4a5f3079" containerName="registry-server" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.346355 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="49582dc6-f2b9-4c34-8dff-a86c4a5f3079" containerName="registry-server" Dec 04 09:54:57 crc kubenswrapper[4776]: E1204 09:54:57.346369 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca04854-33fe-40ac-a3fc-612b8a96fffb" containerName="extract-utilities" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.346376 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca04854-33fe-40ac-a3fc-612b8a96fffb" containerName="extract-utilities" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.346552 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="49582dc6-f2b9-4c34-8dff-a86c4a5f3079" containerName="registry-server" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.346572 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca04854-33fe-40ac-a3fc-612b8a96fffb" containerName="registry-server" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.348662 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.352243 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-8sr7k" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.352822 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.354604 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm"] Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.354861 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.355539 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.357392 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.366317 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm"] Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.456950 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/07958ee1-d044-41bd-a405-eb3d7585f036-metrics\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.457023 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lcbl\" (UniqueName: \"kubernetes.io/projected/07958ee1-d044-41bd-a405-eb3d7585f036-kube-api-access-2lcbl\") pod \"frr-k8s-xv8wt\" (UID: 
\"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.457070 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvg4k\" (UniqueName: \"kubernetes.io/projected/eab1cf4a-97de-4d47-a34d-503d31d32d77-kube-api-access-zvg4k\") pod \"frr-k8s-webhook-server-7fcb986d4-rgsnm\" (UID: \"eab1cf4a-97de-4d47-a34d-503d31d32d77\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.457140 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/07958ee1-d044-41bd-a405-eb3d7585f036-frr-conf\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.457193 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/07958ee1-d044-41bd-a405-eb3d7585f036-reloader\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.457217 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eab1cf4a-97de-4d47-a34d-503d31d32d77-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-rgsnm\" (UID: \"eab1cf4a-97de-4d47-a34d-503d31d32d77\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.457283 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/07958ee1-d044-41bd-a405-eb3d7585f036-frr-startup\") pod \"frr-k8s-xv8wt\" (UID: 
\"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.457306 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07958ee1-d044-41bd-a405-eb3d7585f036-metrics-certs\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.457348 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/07958ee1-d044-41bd-a405-eb3d7585f036-frr-sockets\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.518854 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-lp9tk"] Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.519781 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-lp9tk" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.522733 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.522899 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.523586 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.546382 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hk2lw" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.559576 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/07958ee1-d044-41bd-a405-eb3d7585f036-frr-sockets\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.559635 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/07958ee1-d044-41bd-a405-eb3d7585f036-metrics\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.559655 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lcbl\" (UniqueName: \"kubernetes.io/projected/07958ee1-d044-41bd-a405-eb3d7585f036-kube-api-access-2lcbl\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.559699 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvg4k\" 
(UniqueName: \"kubernetes.io/projected/eab1cf4a-97de-4d47-a34d-503d31d32d77-kube-api-access-zvg4k\") pod \"frr-k8s-webhook-server-7fcb986d4-rgsnm\" (UID: \"eab1cf4a-97de-4d47-a34d-503d31d32d77\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.559736 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/07958ee1-d044-41bd-a405-eb3d7585f036-frr-conf\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.559757 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/07958ee1-d044-41bd-a405-eb3d7585f036-reloader\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.559773 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eab1cf4a-97de-4d47-a34d-503d31d32d77-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-rgsnm\" (UID: \"eab1cf4a-97de-4d47-a34d-503d31d32d77\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.559816 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/07958ee1-d044-41bd-a405-eb3d7585f036-frr-startup\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.559831 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07958ee1-d044-41bd-a405-eb3d7585f036-metrics-certs\") pod \"frr-k8s-xv8wt\" (UID: 
\"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.561055 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-nk6g7"] Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.561336 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/07958ee1-d044-41bd-a405-eb3d7585f036-frr-sockets\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.561559 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/07958ee1-d044-41bd-a405-eb3d7585f036-reloader\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.562124 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/07958ee1-d044-41bd-a405-eb3d7585f036-frr-conf\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.562598 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.562720 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/07958ee1-d044-41bd-a405-eb3d7585f036-metrics\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.563064 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/07958ee1-d044-41bd-a405-eb3d7585f036-frr-startup\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.568339 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07958ee1-d044-41bd-a405-eb3d7585f036-metrics-certs\") pod \"frr-k8s-xv8wt\" (UID: \"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.568844 4776 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.570880 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eab1cf4a-97de-4d47-a34d-503d31d32d77-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-rgsnm\" (UID: \"eab1cf4a-97de-4d47-a34d-503d31d32d77\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.583394 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lcbl\" (UniqueName: \"kubernetes.io/projected/07958ee1-d044-41bd-a405-eb3d7585f036-kube-api-access-2lcbl\") pod \"frr-k8s-xv8wt\" (UID: 
\"07958ee1-d044-41bd-a405-eb3d7585f036\") " pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.592286 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-nk6g7"] Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.593969 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvg4k\" (UniqueName: \"kubernetes.io/projected/eab1cf4a-97de-4d47-a34d-503d31d32d77-kube-api-access-zvg4k\") pod \"frr-k8s-webhook-server-7fcb986d4-rgsnm\" (UID: \"eab1cf4a-97de-4d47-a34d-503d31d32d77\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.665890 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ab18bfc-af5b-4be8-b481-7fdc03809bde-memberlist\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.666150 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cac2d534-af69-46ca-ab51-5ba3b56999fe-metrics-certs\") pod \"controller-f8648f98b-nk6g7\" (UID: \"cac2d534-af69-46ca-ab51-5ba3b56999fe\") " pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.666273 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8xc6\" (UniqueName: \"kubernetes.io/projected/cac2d534-af69-46ca-ab51-5ba3b56999fe-kube-api-access-b8xc6\") pod \"controller-f8648f98b-nk6g7\" (UID: \"cac2d534-af69-46ca-ab51-5ba3b56999fe\") " pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.666301 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cac2d534-af69-46ca-ab51-5ba3b56999fe-cert\") pod \"controller-f8648f98b-nk6g7\" (UID: \"cac2d534-af69-46ca-ab51-5ba3b56999fe\") " pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.666364 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab18bfc-af5b-4be8-b481-7fdc03809bde-metrics-certs\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.666523 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2ab18bfc-af5b-4be8-b481-7fdc03809bde-metallb-excludel2\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.666800 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h59lj\" (UniqueName: \"kubernetes.io/projected/2ab18bfc-af5b-4be8-b481-7fdc03809bde-kube-api-access-h59lj\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.669032 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.684573 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.768036 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cac2d534-af69-46ca-ab51-5ba3b56999fe-metrics-certs\") pod \"controller-f8648f98b-nk6g7\" (UID: \"cac2d534-af69-46ca-ab51-5ba3b56999fe\") " pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.768094 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cac2d534-af69-46ca-ab51-5ba3b56999fe-cert\") pod \"controller-f8648f98b-nk6g7\" (UID: \"cac2d534-af69-46ca-ab51-5ba3b56999fe\") " pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.768113 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8xc6\" (UniqueName: \"kubernetes.io/projected/cac2d534-af69-46ca-ab51-5ba3b56999fe-kube-api-access-b8xc6\") pod \"controller-f8648f98b-nk6g7\" (UID: \"cac2d534-af69-46ca-ab51-5ba3b56999fe\") " pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.768134 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab18bfc-af5b-4be8-b481-7fdc03809bde-metrics-certs\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.768181 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2ab18bfc-af5b-4be8-b481-7fdc03809bde-metallb-excludel2\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:57 crc 
kubenswrapper[4776]: I1204 09:54:57.768217 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h59lj\" (UniqueName: \"kubernetes.io/projected/2ab18bfc-af5b-4be8-b481-7fdc03809bde-kube-api-access-h59lj\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.768259 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ab18bfc-af5b-4be8-b481-7fdc03809bde-memberlist\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:57 crc kubenswrapper[4776]: E1204 09:54:57.768395 4776 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 04 09:54:57 crc kubenswrapper[4776]: E1204 09:54:57.768458 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ab18bfc-af5b-4be8-b481-7fdc03809bde-memberlist podName:2ab18bfc-af5b-4be8-b481-7fdc03809bde nodeName:}" failed. No retries permitted until 2025-12-04 09:54:58.268441764 +0000 UTC m=+943.134922141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2ab18bfc-af5b-4be8-b481-7fdc03809bde-memberlist") pod "speaker-lp9tk" (UID: "2ab18bfc-af5b-4be8-b481-7fdc03809bde") : secret "metallb-memberlist" not found Dec 04 09:54:57 crc kubenswrapper[4776]: E1204 09:54:57.768730 4776 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 04 09:54:57 crc kubenswrapper[4776]: E1204 09:54:57.768779 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cac2d534-af69-46ca-ab51-5ba3b56999fe-metrics-certs podName:cac2d534-af69-46ca-ab51-5ba3b56999fe nodeName:}" failed. 
No retries permitted until 2025-12-04 09:54:58.268753834 +0000 UTC m=+943.135234211 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cac2d534-af69-46ca-ab51-5ba3b56999fe-metrics-certs") pod "controller-f8648f98b-nk6g7" (UID: "cac2d534-af69-46ca-ab51-5ba3b56999fe") : secret "controller-certs-secret" not found Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.770031 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2ab18bfc-af5b-4be8-b481-7fdc03809bde-metallb-excludel2\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.772011 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cac2d534-af69-46ca-ab51-5ba3b56999fe-cert\") pod \"controller-f8648f98b-nk6g7\" (UID: \"cac2d534-af69-46ca-ab51-5ba3b56999fe\") " pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.772291 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab18bfc-af5b-4be8-b481-7fdc03809bde-metrics-certs\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.788552 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h59lj\" (UniqueName: \"kubernetes.io/projected/2ab18bfc-af5b-4be8-b481-7fdc03809bde-kube-api-access-h59lj\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.791185 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8xc6\" 
(UniqueName: \"kubernetes.io/projected/cac2d534-af69-46ca-ab51-5ba3b56999fe-kube-api-access-b8xc6\") pod \"controller-f8648f98b-nk6g7\" (UID: \"cac2d534-af69-46ca-ab51-5ba3b56999fe\") " pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:54:57 crc kubenswrapper[4776]: I1204 09:54:57.975810 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm"] Dec 04 09:54:57 crc kubenswrapper[4776]: W1204 09:54:57.978974 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeab1cf4a_97de_4d47_a34d_503d31d32d77.slice/crio-b10daec12f7b670a7f25cce2f5a020490482eec4e3ffc9d277632f1ab8243e4c WatchSource:0}: Error finding container b10daec12f7b670a7f25cce2f5a020490482eec4e3ffc9d277632f1ab8243e4c: Status 404 returned error can't find the container with id b10daec12f7b670a7f25cce2f5a020490482eec4e3ffc9d277632f1ab8243e4c Dec 04 09:54:58 crc kubenswrapper[4776]: I1204 09:54:58.221188 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm" event={"ID":"eab1cf4a-97de-4d47-a34d-503d31d32d77","Type":"ContainerStarted","Data":"b10daec12f7b670a7f25cce2f5a020490482eec4e3ffc9d277632f1ab8243e4c"} Dec 04 09:54:58 crc kubenswrapper[4776]: I1204 09:54:58.222879 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xv8wt" event={"ID":"07958ee1-d044-41bd-a405-eb3d7585f036","Type":"ContainerStarted","Data":"4f8677387e61103bd9d158724e71e951cf77ad02bba2dcced751eb6873b5bd52"} Dec 04 09:54:58 crc kubenswrapper[4776]: I1204 09:54:58.276213 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ab18bfc-af5b-4be8-b481-7fdc03809bde-memberlist\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:58 crc kubenswrapper[4776]: I1204 
09:54:58.276295 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cac2d534-af69-46ca-ab51-5ba3b56999fe-metrics-certs\") pod \"controller-f8648f98b-nk6g7\" (UID: \"cac2d534-af69-46ca-ab51-5ba3b56999fe\") " pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:54:58 crc kubenswrapper[4776]: E1204 09:54:58.276527 4776 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 04 09:54:58 crc kubenswrapper[4776]: E1204 09:54:58.276702 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ab18bfc-af5b-4be8-b481-7fdc03809bde-memberlist podName:2ab18bfc-af5b-4be8-b481-7fdc03809bde nodeName:}" failed. No retries permitted until 2025-12-04 09:54:59.276667278 +0000 UTC m=+944.143147655 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2ab18bfc-af5b-4be8-b481-7fdc03809bde-memberlist") pod "speaker-lp9tk" (UID: "2ab18bfc-af5b-4be8-b481-7fdc03809bde") : secret "metallb-memberlist" not found Dec 04 09:54:58 crc kubenswrapper[4776]: I1204 09:54:58.282299 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cac2d534-af69-46ca-ab51-5ba3b56999fe-metrics-certs\") pod \"controller-f8648f98b-nk6g7\" (UID: \"cac2d534-af69-46ca-ab51-5ba3b56999fe\") " pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:54:58 crc kubenswrapper[4776]: I1204 09:54:58.545125 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:54:58 crc kubenswrapper[4776]: I1204 09:54:58.775276 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-nk6g7"] Dec 04 09:54:59 crc kubenswrapper[4776]: I1204 09:54:59.234186 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-nk6g7" event={"ID":"cac2d534-af69-46ca-ab51-5ba3b56999fe","Type":"ContainerStarted","Data":"1531476176e044af0b7cb9b53122edc59c2968c52c231f88befd49fccfdaf059"} Dec 04 09:54:59 crc kubenswrapper[4776]: I1204 09:54:59.234687 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:54:59 crc kubenswrapper[4776]: I1204 09:54:59.234703 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-nk6g7" event={"ID":"cac2d534-af69-46ca-ab51-5ba3b56999fe","Type":"ContainerStarted","Data":"2192040cc76b73aaf4464b0358e01a3b2d502e49e53e716f8d1155249fc1846f"} Dec 04 09:54:59 crc kubenswrapper[4776]: I1204 09:54:59.234716 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-nk6g7" event={"ID":"cac2d534-af69-46ca-ab51-5ba3b56999fe","Type":"ContainerStarted","Data":"c9b13d84793193de060bcf27a5f544bdc9109c61846c48866616b7bc952733ac"} Dec 04 09:54:59 crc kubenswrapper[4776]: I1204 09:54:59.260155 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-nk6g7" podStartSLOduration=2.260135064 podStartE2EDuration="2.260135064s" podCreationTimestamp="2025-12-04 09:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:54:59.25650481 +0000 UTC m=+944.122985207" watchObservedRunningTime="2025-12-04 09:54:59.260135064 +0000 UTC m=+944.126615441" Dec 04 09:54:59 crc kubenswrapper[4776]: I1204 
09:54:59.292881 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ab18bfc-af5b-4be8-b481-7fdc03809bde-memberlist\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:59 crc kubenswrapper[4776]: I1204 09:54:59.308867 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ab18bfc-af5b-4be8-b481-7fdc03809bde-memberlist\") pod \"speaker-lp9tk\" (UID: \"2ab18bfc-af5b-4be8-b481-7fdc03809bde\") " pod="metallb-system/speaker-lp9tk" Dec 04 09:54:59 crc kubenswrapper[4776]: I1204 09:54:59.341736 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-lp9tk" Dec 04 09:54:59 crc kubenswrapper[4776]: W1204 09:54:59.368964 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab18bfc_af5b_4be8_b481_7fdc03809bde.slice/crio-26f39085d67ff50ed7281b9ab89b09c282f1c1050d2660810359c85f2aeadcad WatchSource:0}: Error finding container 26f39085d67ff50ed7281b9ab89b09c282f1c1050d2660810359c85f2aeadcad: Status 404 returned error can't find the container with id 26f39085d67ff50ed7281b9ab89b09c282f1c1050d2660810359c85f2aeadcad Dec 04 09:55:00 crc kubenswrapper[4776]: I1204 09:55:00.252033 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lp9tk" event={"ID":"2ab18bfc-af5b-4be8-b481-7fdc03809bde","Type":"ContainerStarted","Data":"4b3dc516c5f89b727737e456f2f765be025a80da6a53e71ffccb2ab81ac96b78"} Dec 04 09:55:00 crc kubenswrapper[4776]: I1204 09:55:00.252358 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lp9tk" event={"ID":"2ab18bfc-af5b-4be8-b481-7fdc03809bde","Type":"ContainerStarted","Data":"33425c96d3c1e34b40d575010cb000fb77b8e0e1f674b9e24ff69570e8ae9237"} Dec 04 09:55:00 crc 
kubenswrapper[4776]: I1204 09:55:00.252370 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lp9tk" event={"ID":"2ab18bfc-af5b-4be8-b481-7fdc03809bde","Type":"ContainerStarted","Data":"26f39085d67ff50ed7281b9ab89b09c282f1c1050d2660810359c85f2aeadcad"} Dec 04 09:55:00 crc kubenswrapper[4776]: I1204 09:55:00.252554 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-lp9tk" Dec 04 09:55:00 crc kubenswrapper[4776]: I1204 09:55:00.278958 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-lp9tk" podStartSLOduration=3.278937948 podStartE2EDuration="3.278937948s" podCreationTimestamp="2025-12-04 09:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:55:00.273796983 +0000 UTC m=+945.140277360" watchObservedRunningTime="2025-12-04 09:55:00.278937948 +0000 UTC m=+945.145418325" Dec 04 09:55:08 crc kubenswrapper[4776]: I1204 09:55:08.455373 4776 generic.go:334] "Generic (PLEG): container finished" podID="07958ee1-d044-41bd-a405-eb3d7585f036" containerID="e6842cac3eed9b10be2a56d7d4b64f590acc1b9e4d25e42da90937874f8efe15" exitCode=0 Dec 04 09:55:08 crc kubenswrapper[4776]: I1204 09:55:08.455637 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xv8wt" event={"ID":"07958ee1-d044-41bd-a405-eb3d7585f036","Type":"ContainerDied","Data":"e6842cac3eed9b10be2a56d7d4b64f590acc1b9e4d25e42da90937874f8efe15"} Dec 04 09:55:08 crc kubenswrapper[4776]: I1204 09:55:08.459085 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm" event={"ID":"eab1cf4a-97de-4d47-a34d-503d31d32d77","Type":"ContainerStarted","Data":"d0d2390f7becf4c22d686a8ee4cf6061291edfdb832f9b2d3f9962289a530004"} Dec 04 09:55:08 crc kubenswrapper[4776]: I1204 09:55:08.459268 4776 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm" Dec 04 09:55:08 crc kubenswrapper[4776]: I1204 09:55:08.517857 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm" podStartSLOduration=2.217515979 podStartE2EDuration="11.517826904s" podCreationTimestamp="2025-12-04 09:54:57 +0000 UTC" firstStartedPulling="2025-12-04 09:54:57.981321523 +0000 UTC m=+942.847801900" lastFinishedPulling="2025-12-04 09:55:07.281632448 +0000 UTC m=+952.148112825" observedRunningTime="2025-12-04 09:55:08.513870107 +0000 UTC m=+953.380350484" watchObservedRunningTime="2025-12-04 09:55:08.517826904 +0000 UTC m=+953.384307281" Dec 04 09:55:09 crc kubenswrapper[4776]: I1204 09:55:09.346951 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-lp9tk" Dec 04 09:55:09 crc kubenswrapper[4776]: I1204 09:55:09.469290 4776 generic.go:334] "Generic (PLEG): container finished" podID="07958ee1-d044-41bd-a405-eb3d7585f036" containerID="068865cb4381377907dee81705f32c713ca45f915b7f20bc015ecac1be5444c0" exitCode=0 Dec 04 09:55:09 crc kubenswrapper[4776]: I1204 09:55:09.469458 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xv8wt" event={"ID":"07958ee1-d044-41bd-a405-eb3d7585f036","Type":"ContainerDied","Data":"068865cb4381377907dee81705f32c713ca45f915b7f20bc015ecac1be5444c0"} Dec 04 09:55:10 crc kubenswrapper[4776]: I1204 09:55:10.478746 4776 generic.go:334] "Generic (PLEG): container finished" podID="07958ee1-d044-41bd-a405-eb3d7585f036" containerID="1113a1fcaae5b69addfa645f2a3b637868509376eb491e74ebbe56151a93dfe2" exitCode=0 Dec 04 09:55:10 crc kubenswrapper[4776]: I1204 09:55:10.478805 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xv8wt" 
event={"ID":"07958ee1-d044-41bd-a405-eb3d7585f036","Type":"ContainerDied","Data":"1113a1fcaae5b69addfa645f2a3b637868509376eb491e74ebbe56151a93dfe2"} Dec 04 09:55:11 crc kubenswrapper[4776]: I1204 09:55:11.490509 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xv8wt" event={"ID":"07958ee1-d044-41bd-a405-eb3d7585f036","Type":"ContainerStarted","Data":"f0c6283967dfa2770bd89a245d7c0cb63b064f827a780c8815faf942a436df95"} Dec 04 09:55:11 crc kubenswrapper[4776]: I1204 09:55:11.491029 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xv8wt" event={"ID":"07958ee1-d044-41bd-a405-eb3d7585f036","Type":"ContainerStarted","Data":"b40393cf908ace0fbfd7996085b00b4cbc10d759a8b2071cc4b09d6146f8b46f"} Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.509802 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xv8wt" event={"ID":"07958ee1-d044-41bd-a405-eb3d7585f036","Type":"ContainerStarted","Data":"6d0fc9e77ed6c2d4ff71b21ad79b3ffda75d5215a188b52d951b4d69fc4adb58"} Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.510344 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xv8wt" event={"ID":"07958ee1-d044-41bd-a405-eb3d7585f036","Type":"ContainerStarted","Data":"66579e745c9bd055d32f541bfd55d4bf3ad2d8713422b23509662bf7f13aa275"} Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.510361 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xv8wt" event={"ID":"07958ee1-d044-41bd-a405-eb3d7585f036","Type":"ContainerStarted","Data":"e959923110f5f56f817d77ae6fe7c63a10e9ff249ebc5f4cadcb68ea84884868"} Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.510375 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xv8wt" event={"ID":"07958ee1-d044-41bd-a405-eb3d7585f036","Type":"ContainerStarted","Data":"820205ec653436df5914544a4e8b9e87720ad2222d3210954ecf343efb9914f8"} Dec 04 09:55:12 crc 
kubenswrapper[4776]: I1204 09:55:12.541388 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-xv8wt" podStartSLOduration=6.112916181 podStartE2EDuration="15.541362057s" podCreationTimestamp="2025-12-04 09:54:57 +0000 UTC" firstStartedPulling="2025-12-04 09:54:57.814986056 +0000 UTC m=+942.681466423" lastFinishedPulling="2025-12-04 09:55:07.243431922 +0000 UTC m=+952.109912299" observedRunningTime="2025-12-04 09:55:12.537093771 +0000 UTC m=+957.403574158" watchObservedRunningTime="2025-12-04 09:55:12.541362057 +0000 UTC m=+957.407842434" Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.670676 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.709135 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rlkd4"] Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.710152 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rlkd4" Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.713631 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.714284 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-97lwn" Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.715601 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.727497 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rlkd4"] Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.748408 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.813745 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xjnq\" (UniqueName: \"kubernetes.io/projected/cdb8127a-f785-4eff-a02c-997023735325-kube-api-access-6xjnq\") pod \"openstack-operator-index-rlkd4\" (UID: \"cdb8127a-f785-4eff-a02c-997023735325\") " pod="openstack-operators/openstack-operator-index-rlkd4" Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.915717 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xjnq\" (UniqueName: \"kubernetes.io/projected/cdb8127a-f785-4eff-a02c-997023735325-kube-api-access-6xjnq\") pod \"openstack-operator-index-rlkd4\" (UID: \"cdb8127a-f785-4eff-a02c-997023735325\") " pod="openstack-operators/openstack-operator-index-rlkd4" Dec 04 09:55:12 crc kubenswrapper[4776]: I1204 09:55:12.939055 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xjnq\" 
(UniqueName: \"kubernetes.io/projected/cdb8127a-f785-4eff-a02c-997023735325-kube-api-access-6xjnq\") pod \"openstack-operator-index-rlkd4\" (UID: \"cdb8127a-f785-4eff-a02c-997023735325\") " pod="openstack-operators/openstack-operator-index-rlkd4" Dec 04 09:55:13 crc kubenswrapper[4776]: I1204 09:55:13.037054 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rlkd4" Dec 04 09:55:13 crc kubenswrapper[4776]: I1204 09:55:13.516253 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:55:13 crc kubenswrapper[4776]: I1204 09:55:13.522547 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rlkd4"] Dec 04 09:55:13 crc kubenswrapper[4776]: W1204 09:55:13.532089 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdb8127a_f785_4eff_a02c_997023735325.slice/crio-5bce421fbc0b443720d01a2c670a70465dd5aa73d2ea540d9ae11571637e8921 WatchSource:0}: Error finding container 5bce421fbc0b443720d01a2c670a70465dd5aa73d2ea540d9ae11571637e8921: Status 404 returned error can't find the container with id 5bce421fbc0b443720d01a2c670a70465dd5aa73d2ea540d9ae11571637e8921 Dec 04 09:55:14 crc kubenswrapper[4776]: I1204 09:55:14.530647 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rlkd4" event={"ID":"cdb8127a-f785-4eff-a02c-997023735325","Type":"ContainerStarted","Data":"5bce421fbc0b443720d01a2c670a70465dd5aa73d2ea540d9ae11571637e8921"} Dec 04 09:55:16 crc kubenswrapper[4776]: I1204 09:55:16.090111 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rlkd4"] Dec 04 09:55:16 crc kubenswrapper[4776]: I1204 09:55:16.548259 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-rlkd4" event={"ID":"cdb8127a-f785-4eff-a02c-997023735325","Type":"ContainerStarted","Data":"9312ad2d681cad6dac94dbc9d1434d95aa33c8880c31f039e60f6cd9aa14a07d"} Dec 04 09:55:16 crc kubenswrapper[4776]: I1204 09:55:16.548610 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-rlkd4" podUID="cdb8127a-f785-4eff-a02c-997023735325" containerName="registry-server" containerID="cri-o://9312ad2d681cad6dac94dbc9d1434d95aa33c8880c31f039e60f6cd9aa14a07d" gracePeriod=2 Dec 04 09:55:16 crc kubenswrapper[4776]: I1204 09:55:16.693222 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rlkd4" podStartSLOduration=2.020458108 podStartE2EDuration="4.692943559s" podCreationTimestamp="2025-12-04 09:55:12 +0000 UTC" firstStartedPulling="2025-12-04 09:55:13.533747908 +0000 UTC m=+958.400228285" lastFinishedPulling="2025-12-04 09:55:16.206233359 +0000 UTC m=+961.072713736" observedRunningTime="2025-12-04 09:55:16.5657921 +0000 UTC m=+961.432272487" watchObservedRunningTime="2025-12-04 09:55:16.692943559 +0000 UTC m=+961.559423956" Dec 04 09:55:16 crc kubenswrapper[4776]: I1204 09:55:16.698687 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qr9vs"] Dec 04 09:55:16 crc kubenswrapper[4776]: I1204 09:55:16.701189 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qr9vs" Dec 04 09:55:16 crc kubenswrapper[4776]: I1204 09:55:16.703073 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qr9vs"] Dec 04 09:55:16 crc kubenswrapper[4776]: I1204 09:55:16.798413 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfpv7\" (UniqueName: \"kubernetes.io/projected/f6f9de95-98b9-47ca-b4a0-c5a99ca9a610-kube-api-access-xfpv7\") pod \"openstack-operator-index-qr9vs\" (UID: \"f6f9de95-98b9-47ca-b4a0-c5a99ca9a610\") " pod="openstack-operators/openstack-operator-index-qr9vs" Dec 04 09:55:16 crc kubenswrapper[4776]: I1204 09:55:16.900406 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfpv7\" (UniqueName: \"kubernetes.io/projected/f6f9de95-98b9-47ca-b4a0-c5a99ca9a610-kube-api-access-xfpv7\") pod \"openstack-operator-index-qr9vs\" (UID: \"f6f9de95-98b9-47ca-b4a0-c5a99ca9a610\") " pod="openstack-operators/openstack-operator-index-qr9vs" Dec 04 09:55:16 crc kubenswrapper[4776]: I1204 09:55:16.924306 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfpv7\" (UniqueName: \"kubernetes.io/projected/f6f9de95-98b9-47ca-b4a0-c5a99ca9a610-kube-api-access-xfpv7\") pod \"openstack-operator-index-qr9vs\" (UID: \"f6f9de95-98b9-47ca-b4a0-c5a99ca9a610\") " pod="openstack-operators/openstack-operator-index-qr9vs" Dec 04 09:55:16 crc kubenswrapper[4776]: I1204 09:55:16.965453 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rlkd4" Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.000941 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xjnq\" (UniqueName: \"kubernetes.io/projected/cdb8127a-f785-4eff-a02c-997023735325-kube-api-access-6xjnq\") pod \"cdb8127a-f785-4eff-a02c-997023735325\" (UID: \"cdb8127a-f785-4eff-a02c-997023735325\") " Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.005649 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb8127a-f785-4eff-a02c-997023735325-kube-api-access-6xjnq" (OuterVolumeSpecName: "kube-api-access-6xjnq") pod "cdb8127a-f785-4eff-a02c-997023735325" (UID: "cdb8127a-f785-4eff-a02c-997023735325"). InnerVolumeSpecName "kube-api-access-6xjnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.028585 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qr9vs" Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.102361 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xjnq\" (UniqueName: \"kubernetes.io/projected/cdb8127a-f785-4eff-a02c-997023735325-kube-api-access-6xjnq\") on node \"crc\" DevicePath \"\"" Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.450673 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qr9vs"] Dec 04 09:55:17 crc kubenswrapper[4776]: W1204 09:55:17.459489 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f9de95_98b9_47ca_b4a0_c5a99ca9a610.slice/crio-ff2f5aa06a298c27d800756faacb577c59b508f5a67406484690b6905be819aa WatchSource:0}: Error finding container ff2f5aa06a298c27d800756faacb577c59b508f5a67406484690b6905be819aa: Status 404 returned error can't find the container with id ff2f5aa06a298c27d800756faacb577c59b508f5a67406484690b6905be819aa Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.556121 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qr9vs" event={"ID":"f6f9de95-98b9-47ca-b4a0-c5a99ca9a610","Type":"ContainerStarted","Data":"ff2f5aa06a298c27d800756faacb577c59b508f5a67406484690b6905be819aa"} Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.558503 4776 generic.go:334] "Generic (PLEG): container finished" podID="cdb8127a-f785-4eff-a02c-997023735325" containerID="9312ad2d681cad6dac94dbc9d1434d95aa33c8880c31f039e60f6cd9aa14a07d" exitCode=0 Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.558580 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rlkd4" event={"ID":"cdb8127a-f785-4eff-a02c-997023735325","Type":"ContainerDied","Data":"9312ad2d681cad6dac94dbc9d1434d95aa33c8880c31f039e60f6cd9aa14a07d"} Dec 04 
09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.558665 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rlkd4" event={"ID":"cdb8127a-f785-4eff-a02c-997023735325","Type":"ContainerDied","Data":"5bce421fbc0b443720d01a2c670a70465dd5aa73d2ea540d9ae11571637e8921"} Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.558695 4776 scope.go:117] "RemoveContainer" containerID="9312ad2d681cad6dac94dbc9d1434d95aa33c8880c31f039e60f6cd9aa14a07d" Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.558902 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rlkd4" Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.592330 4776 scope.go:117] "RemoveContainer" containerID="9312ad2d681cad6dac94dbc9d1434d95aa33c8880c31f039e60f6cd9aa14a07d" Dec 04 09:55:17 crc kubenswrapper[4776]: E1204 09:55:17.594078 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9312ad2d681cad6dac94dbc9d1434d95aa33c8880c31f039e60f6cd9aa14a07d\": container with ID starting with 9312ad2d681cad6dac94dbc9d1434d95aa33c8880c31f039e60f6cd9aa14a07d not found: ID does not exist" containerID="9312ad2d681cad6dac94dbc9d1434d95aa33c8880c31f039e60f6cd9aa14a07d" Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.594153 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9312ad2d681cad6dac94dbc9d1434d95aa33c8880c31f039e60f6cd9aa14a07d"} err="failed to get container status \"9312ad2d681cad6dac94dbc9d1434d95aa33c8880c31f039e60f6cd9aa14a07d\": rpc error: code = NotFound desc = could not find container \"9312ad2d681cad6dac94dbc9d1434d95aa33c8880c31f039e60f6cd9aa14a07d\": container with ID starting with 9312ad2d681cad6dac94dbc9d1434d95aa33c8880c31f039e60f6cd9aa14a07d not found: ID does not exist" Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.622243 
4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rlkd4"] Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.634415 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rlkd4"] Dec 04 09:55:17 crc kubenswrapper[4776]: I1204 09:55:17.690891 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rgsnm" Dec 04 09:55:18 crc kubenswrapper[4776]: I1204 09:55:18.549967 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-nk6g7" Dec 04 09:55:18 crc kubenswrapper[4776]: I1204 09:55:18.566798 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qr9vs" event={"ID":"f6f9de95-98b9-47ca-b4a0-c5a99ca9a610","Type":"ContainerStarted","Data":"9e4409acd69c6efea87ef2652e4d8196429cbb8715769a78038f6238d3dc5b0a"} Dec 04 09:55:18 crc kubenswrapper[4776]: I1204 09:55:18.591799 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qr9vs" podStartSLOduration=2.5357826709999998 podStartE2EDuration="2.591767984s" podCreationTimestamp="2025-12-04 09:55:16 +0000 UTC" firstStartedPulling="2025-12-04 09:55:17.465104067 +0000 UTC m=+962.331584444" lastFinishedPulling="2025-12-04 09:55:17.52108938 +0000 UTC m=+962.387569757" observedRunningTime="2025-12-04 09:55:18.588395606 +0000 UTC m=+963.454875993" watchObservedRunningTime="2025-12-04 09:55:18.591767984 +0000 UTC m=+963.458248381" Dec 04 09:55:19 crc kubenswrapper[4776]: I1204 09:55:19.460538 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdb8127a-f785-4eff-a02c-997023735325" path="/var/lib/kubelet/pods/cdb8127a-f785-4eff-a02c-997023735325/volumes" Dec 04 09:55:27 crc kubenswrapper[4776]: I1204 09:55:27.029501 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qr9vs" Dec 04 09:55:27 crc kubenswrapper[4776]: I1204 09:55:27.032577 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qr9vs" Dec 04 09:55:27 crc kubenswrapper[4776]: I1204 09:55:27.060424 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qr9vs" Dec 04 09:55:27 crc kubenswrapper[4776]: I1204 09:55:27.672858 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qr9vs" Dec 04 09:55:27 crc kubenswrapper[4776]: I1204 09:55:27.677644 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-xv8wt" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.552348 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz"] Dec 04 09:55:28 crc kubenswrapper[4776]: E1204 09:55:28.552890 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb8127a-f785-4eff-a02c-997023735325" containerName="registry-server" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.552905 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb8127a-f785-4eff-a02c-997023735325" containerName="registry-server" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.553071 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb8127a-f785-4eff-a02c-997023735325" containerName="registry-server" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.553924 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.556790 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tm4th" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.567433 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz"] Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.578386 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14389a80-35f5-46c7-9689-acaa3fd5310d-bundle\") pod \"f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz\" (UID: \"14389a80-35f5-46c7-9689-acaa3fd5310d\") " pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.578449 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrdv2\" (UniqueName: \"kubernetes.io/projected/14389a80-35f5-46c7-9689-acaa3fd5310d-kube-api-access-jrdv2\") pod \"f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz\" (UID: \"14389a80-35f5-46c7-9689-acaa3fd5310d\") " pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.578490 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14389a80-35f5-46c7-9689-acaa3fd5310d-util\") pod \"f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz\" (UID: \"14389a80-35f5-46c7-9689-acaa3fd5310d\") " pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 
09:55:28.679274 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14389a80-35f5-46c7-9689-acaa3fd5310d-util\") pod \"f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz\" (UID: \"14389a80-35f5-46c7-9689-acaa3fd5310d\") " pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.679380 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14389a80-35f5-46c7-9689-acaa3fd5310d-bundle\") pod \"f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz\" (UID: \"14389a80-35f5-46c7-9689-acaa3fd5310d\") " pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.679428 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrdv2\" (UniqueName: \"kubernetes.io/projected/14389a80-35f5-46c7-9689-acaa3fd5310d-kube-api-access-jrdv2\") pod \"f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz\" (UID: \"14389a80-35f5-46c7-9689-acaa3fd5310d\") " pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.680186 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14389a80-35f5-46c7-9689-acaa3fd5310d-util\") pod \"f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz\" (UID: \"14389a80-35f5-46c7-9689-acaa3fd5310d\") " pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.680253 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/14389a80-35f5-46c7-9689-acaa3fd5310d-bundle\") pod \"f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz\" (UID: \"14389a80-35f5-46c7-9689-acaa3fd5310d\") " pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.706952 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrdv2\" (UniqueName: \"kubernetes.io/projected/14389a80-35f5-46c7-9689-acaa3fd5310d-kube-api-access-jrdv2\") pod \"f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz\" (UID: \"14389a80-35f5-46c7-9689-acaa3fd5310d\") " pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" Dec 04 09:55:28 crc kubenswrapper[4776]: I1204 09:55:28.877722 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" Dec 04 09:55:29 crc kubenswrapper[4776]: I1204 09:55:29.669857 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz"] Dec 04 09:55:30 crc kubenswrapper[4776]: I1204 09:55:30.661765 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" event={"ID":"14389a80-35f5-46c7-9689-acaa3fd5310d","Type":"ContainerStarted","Data":"8de1574c3bde0644a036740aa89f467fa6e02b4791498981b0ca7f7d879af1be"} Dec 04 09:55:34 crc kubenswrapper[4776]: I1204 09:55:34.691222 4776 generic.go:334] "Generic (PLEG): container finished" podID="14389a80-35f5-46c7-9689-acaa3fd5310d" containerID="b9975ec7d32e21a0999061722ea1bf15a613b9bc1eeabf51723e18ac98e5f991" exitCode=0 Dec 04 09:55:34 crc kubenswrapper[4776]: I1204 09:55:34.691288 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" event={"ID":"14389a80-35f5-46c7-9689-acaa3fd5310d","Type":"ContainerDied","Data":"b9975ec7d32e21a0999061722ea1bf15a613b9bc1eeabf51723e18ac98e5f991"} Dec 04 09:55:37 crc kubenswrapper[4776]: I1204 09:55:37.712272 4776 generic.go:334] "Generic (PLEG): container finished" podID="14389a80-35f5-46c7-9689-acaa3fd5310d" containerID="056eb5cb56403eda6e02a001508ece6c4812da9c14dbdcf130abf37f4a538a1d" exitCode=0 Dec 04 09:55:37 crc kubenswrapper[4776]: I1204 09:55:37.712350 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" event={"ID":"14389a80-35f5-46c7-9689-acaa3fd5310d","Type":"ContainerDied","Data":"056eb5cb56403eda6e02a001508ece6c4812da9c14dbdcf130abf37f4a538a1d"} Dec 04 09:55:38 crc kubenswrapper[4776]: I1204 09:55:38.720782 4776 generic.go:334] "Generic (PLEG): container finished" podID="14389a80-35f5-46c7-9689-acaa3fd5310d" containerID="7afd87bdf74e1fec3f7bbf8baebbcd14feb965b068e63dcb38f0732fe0c84f3a" exitCode=0 Dec 04 09:55:38 crc kubenswrapper[4776]: I1204 09:55:38.720847 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" event={"ID":"14389a80-35f5-46c7-9689-acaa3fd5310d","Type":"ContainerDied","Data":"7afd87bdf74e1fec3f7bbf8baebbcd14feb965b068e63dcb38f0732fe0c84f3a"} Dec 04 09:55:39 crc kubenswrapper[4776]: I1204 09:55:39.987678 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" Dec 04 09:55:40 crc kubenswrapper[4776]: I1204 09:55:40.131740 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14389a80-35f5-46c7-9689-acaa3fd5310d-bundle\") pod \"14389a80-35f5-46c7-9689-acaa3fd5310d\" (UID: \"14389a80-35f5-46c7-9689-acaa3fd5310d\") " Dec 04 09:55:40 crc kubenswrapper[4776]: I1204 09:55:40.132145 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrdv2\" (UniqueName: \"kubernetes.io/projected/14389a80-35f5-46c7-9689-acaa3fd5310d-kube-api-access-jrdv2\") pod \"14389a80-35f5-46c7-9689-acaa3fd5310d\" (UID: \"14389a80-35f5-46c7-9689-acaa3fd5310d\") " Dec 04 09:55:40 crc kubenswrapper[4776]: I1204 09:55:40.132287 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14389a80-35f5-46c7-9689-acaa3fd5310d-util\") pod \"14389a80-35f5-46c7-9689-acaa3fd5310d\" (UID: \"14389a80-35f5-46c7-9689-acaa3fd5310d\") " Dec 04 09:55:40 crc kubenswrapper[4776]: I1204 09:55:40.132851 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14389a80-35f5-46c7-9689-acaa3fd5310d-bundle" (OuterVolumeSpecName: "bundle") pod "14389a80-35f5-46c7-9689-acaa3fd5310d" (UID: "14389a80-35f5-46c7-9689-acaa3fd5310d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:55:40 crc kubenswrapper[4776]: I1204 09:55:40.133360 4776 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14389a80-35f5-46c7-9689-acaa3fd5310d-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:55:40 crc kubenswrapper[4776]: I1204 09:55:40.138746 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14389a80-35f5-46c7-9689-acaa3fd5310d-kube-api-access-jrdv2" (OuterVolumeSpecName: "kube-api-access-jrdv2") pod "14389a80-35f5-46c7-9689-acaa3fd5310d" (UID: "14389a80-35f5-46c7-9689-acaa3fd5310d"). InnerVolumeSpecName "kube-api-access-jrdv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:55:40 crc kubenswrapper[4776]: I1204 09:55:40.145169 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14389a80-35f5-46c7-9689-acaa3fd5310d-util" (OuterVolumeSpecName: "util") pod "14389a80-35f5-46c7-9689-acaa3fd5310d" (UID: "14389a80-35f5-46c7-9689-acaa3fd5310d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 09:55:40 crc kubenswrapper[4776]: I1204 09:55:40.234875 4776 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14389a80-35f5-46c7-9689-acaa3fd5310d-util\") on node \"crc\" DevicePath \"\"" Dec 04 09:55:40 crc kubenswrapper[4776]: I1204 09:55:40.235098 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrdv2\" (UniqueName: \"kubernetes.io/projected/14389a80-35f5-46c7-9689-acaa3fd5310d-kube-api-access-jrdv2\") on node \"crc\" DevicePath \"\"" Dec 04 09:55:40 crc kubenswrapper[4776]: I1204 09:55:40.735818 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" event={"ID":"14389a80-35f5-46c7-9689-acaa3fd5310d","Type":"ContainerDied","Data":"8de1574c3bde0644a036740aa89f467fa6e02b4791498981b0ca7f7d879af1be"} Dec 04 09:55:40 crc kubenswrapper[4776]: I1204 09:55:40.735880 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8de1574c3bde0644a036740aa89f467fa6e02b4791498981b0ca7f7d879af1be" Dec 04 09:55:40 crc kubenswrapper[4776]: I1204 09:55:40.736243 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz" Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.150355 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6db4dd56f6-5962s"] Dec 04 09:55:46 crc kubenswrapper[4776]: E1204 09:55:46.151229 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14389a80-35f5-46c7-9689-acaa3fd5310d" containerName="pull" Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.151245 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="14389a80-35f5-46c7-9689-acaa3fd5310d" containerName="pull" Dec 04 09:55:46 crc kubenswrapper[4776]: E1204 09:55:46.151257 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14389a80-35f5-46c7-9689-acaa3fd5310d" containerName="extract" Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.151264 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="14389a80-35f5-46c7-9689-acaa3fd5310d" containerName="extract" Dec 04 09:55:46 crc kubenswrapper[4776]: E1204 09:55:46.151285 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14389a80-35f5-46c7-9689-acaa3fd5310d" containerName="util" Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.151291 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="14389a80-35f5-46c7-9689-acaa3fd5310d" containerName="util" Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.151397 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="14389a80-35f5-46c7-9689-acaa3fd5310d" containerName="extract" Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.151827 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6db4dd56f6-5962s" Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.155312 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-x8m9j" Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.176242 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6db4dd56f6-5962s"] Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.314062 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66zp5\" (UniqueName: \"kubernetes.io/projected/730ff180-62d9-4a70-b200-e2ac3ea2b4c8-kube-api-access-66zp5\") pod \"openstack-operator-controller-operator-6db4dd56f6-5962s\" (UID: \"730ff180-62d9-4a70-b200-e2ac3ea2b4c8\") " pod="openstack-operators/openstack-operator-controller-operator-6db4dd56f6-5962s" Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.414957 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66zp5\" (UniqueName: \"kubernetes.io/projected/730ff180-62d9-4a70-b200-e2ac3ea2b4c8-kube-api-access-66zp5\") pod \"openstack-operator-controller-operator-6db4dd56f6-5962s\" (UID: \"730ff180-62d9-4a70-b200-e2ac3ea2b4c8\") " pod="openstack-operators/openstack-operator-controller-operator-6db4dd56f6-5962s" Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.439901 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66zp5\" (UniqueName: \"kubernetes.io/projected/730ff180-62d9-4a70-b200-e2ac3ea2b4c8-kube-api-access-66zp5\") pod \"openstack-operator-controller-operator-6db4dd56f6-5962s\" (UID: \"730ff180-62d9-4a70-b200-e2ac3ea2b4c8\") " pod="openstack-operators/openstack-operator-controller-operator-6db4dd56f6-5962s" Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.473313 4776 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6db4dd56f6-5962s" Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.754850 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6db4dd56f6-5962s"] Dec 04 09:55:46 crc kubenswrapper[4776]: I1204 09:55:46.782611 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6db4dd56f6-5962s" event={"ID":"730ff180-62d9-4a70-b200-e2ac3ea2b4c8","Type":"ContainerStarted","Data":"675b397615bc3e09df98bd14ebe5d62e9d35cf1b6677b448f8448e97aea09b1a"} Dec 04 09:55:52 crc kubenswrapper[4776]: I1204 09:55:52.834403 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6db4dd56f6-5962s" event={"ID":"730ff180-62d9-4a70-b200-e2ac3ea2b4c8","Type":"ContainerStarted","Data":"2c2527790f6589241cf911e4af4b377b309d90cac0af8796eff5f78c5a504b76"} Dec 04 09:55:52 crc kubenswrapper[4776]: I1204 09:55:52.835002 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6db4dd56f6-5962s" Dec 04 09:55:52 crc kubenswrapper[4776]: I1204 09:55:52.867117 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6db4dd56f6-5962s" podStartSLOduration=1.259700726 podStartE2EDuration="6.867099012s" podCreationTimestamp="2025-12-04 09:55:46 +0000 UTC" firstStartedPulling="2025-12-04 09:55:46.766474945 +0000 UTC m=+991.632955322" lastFinishedPulling="2025-12-04 09:55:52.373873231 +0000 UTC m=+997.240353608" observedRunningTime="2025-12-04 09:55:52.861358203 +0000 UTC m=+997.727838580" watchObservedRunningTime="2025-12-04 09:55:52.867099012 +0000 UTC m=+997.733579389" Dec 04 09:56:06 crc kubenswrapper[4776]: I1204 09:56:06.476970 4776 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6db4dd56f6-5962s" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.281977 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.283623 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.293178 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5m9x5" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.298441 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.299648 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.302945 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-bkrqv" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.303404 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.325270 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.336284 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.337798 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.340389 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-nhsm6" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.352124 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.353344 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.356822 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-r7kjm" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.374146 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.389295 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.396596 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqfl6\" (UniqueName: \"kubernetes.io/projected/ec5df28d-5944-43f3-bf28-12e1062b1060-kube-api-access-kqfl6\") pod \"barbican-operator-controller-manager-7d9dfd778-jtqt7\" (UID: \"ec5df28d-5944-43f3-bf28-12e1062b1060\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.396680 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4nb7\" (UniqueName: \"kubernetes.io/projected/61813ce8-b03b-473b-9606-22515ab1de03-kube-api-access-h4nb7\") pod \"cinder-operator-controller-manager-859b6ccc6-8h27m\" (UID: \"61813ce8-b03b-473b-9606-22515ab1de03\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.396708 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn7f6\" (UniqueName: \"kubernetes.io/projected/2ceaf037-5fce-4ef5-b273-724eb446e0af-kube-api-access-pn7f6\") pod 
\"glance-operator-controller-manager-77987cd8cd-d7zhq\" (UID: \"2ceaf037-5fce-4ef5-b273-724eb446e0af\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.396744 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdjhv\" (UniqueName: \"kubernetes.io/projected/df5a8995-658c-4525-93ac-604d3c2af213-kube-api-access-zdjhv\") pod \"designate-operator-controller-manager-78b4bc895b-z8q57\" (UID: \"df5a8995-658c-4525-93ac-604d3c2af213\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.406834 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.408554 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.413545 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-4m6dc" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.426032 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.428296 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.433155 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-58dpc" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.524416 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4nb7\" (UniqueName: \"kubernetes.io/projected/61813ce8-b03b-473b-9606-22515ab1de03-kube-api-access-h4nb7\") pod \"cinder-operator-controller-manager-859b6ccc6-8h27m\" (UID: \"61813ce8-b03b-473b-9606-22515ab1de03\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.524477 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn7f6\" (UniqueName: \"kubernetes.io/projected/2ceaf037-5fce-4ef5-b273-724eb446e0af-kube-api-access-pn7f6\") pod \"glance-operator-controller-manager-77987cd8cd-d7zhq\" (UID: \"2ceaf037-5fce-4ef5-b273-724eb446e0af\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.524518 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdjhv\" (UniqueName: \"kubernetes.io/projected/df5a8995-658c-4525-93ac-604d3c2af213-kube-api-access-zdjhv\") pod \"designate-operator-controller-manager-78b4bc895b-z8q57\" (UID: \"df5a8995-658c-4525-93ac-604d3c2af213\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.524603 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqfl6\" (UniqueName: \"kubernetes.io/projected/ec5df28d-5944-43f3-bf28-12e1062b1060-kube-api-access-kqfl6\") pod 
\"barbican-operator-controller-manager-7d9dfd778-jtqt7\" (UID: \"ec5df28d-5944-43f3-bf28-12e1062b1060\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.545515 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.556635 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn7f6\" (UniqueName: \"kubernetes.io/projected/2ceaf037-5fce-4ef5-b273-724eb446e0af-kube-api-access-pn7f6\") pod \"glance-operator-controller-manager-77987cd8cd-d7zhq\" (UID: \"2ceaf037-5fce-4ef5-b273-724eb446e0af\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.563681 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4nb7\" (UniqueName: \"kubernetes.io/projected/61813ce8-b03b-473b-9606-22515ab1de03-kube-api-access-h4nb7\") pod \"cinder-operator-controller-manager-859b6ccc6-8h27m\" (UID: \"61813ce8-b03b-473b-9606-22515ab1de03\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.564546 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.564672 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.565967 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdjhv\" (UniqueName: \"kubernetes.io/projected/df5a8995-658c-4525-93ac-604d3c2af213-kube-api-access-zdjhv\") pod \"designate-operator-controller-manager-78b4bc895b-z8q57\" (UID: \"df5a8995-658c-4525-93ac-604d3c2af213\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.568875 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.571955 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.573563 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dqw7b" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.578682 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqfl6\" (UniqueName: \"kubernetes.io/projected/ec5df28d-5944-43f3-bf28-12e1062b1060-kube-api-access-kqfl6\") pod \"barbican-operator-controller-manager-7d9dfd778-jtqt7\" (UID: \"ec5df28d-5944-43f3-bf28-12e1062b1060\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.588361 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.588674 4776 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-p926w" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.593473 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.606810 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.619039 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.622537 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.628422 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5vsc\" (UniqueName: \"kubernetes.io/projected/25849bc1-46e2-4ff1-a61a-f0b7105290bf-kube-api-access-j5vsc\") pod \"heat-operator-controller-manager-5f64f6f8bb-x9jlc\" (UID: \"25849bc1-46e2-4ff1-a61a-f0b7105290bf\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.628518 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-665c5\" (UniqueName: \"kubernetes.io/projected/34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe-kube-api-access-665c5\") pod \"horizon-operator-controller-manager-68c6d99b8f-z6kf6\" (UID: \"34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.628546 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmb6n\" (UniqueName: \"kubernetes.io/projected/58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1-kube-api-access-jmb6n\") pod \"ironic-operator-controller-manager-6c548fd776-ldf84\" (UID: \"58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.628582 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert\") pod \"infra-operator-controller-manager-57548d458d-fk6f5\" (UID: \"a0857db7-00e4-410c-b5a2-945a46ae175a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.628620 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg4fd\" (UniqueName: \"kubernetes.io/projected/a0857db7-00e4-410c-b5a2-945a46ae175a-kube-api-access-cg4fd\") pod \"infra-operator-controller-manager-57548d458d-fk6f5\" (UID: \"a0857db7-00e4-410c-b5a2-945a46ae175a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.658119 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.672646 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.681058 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.682571 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.683988 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.687359 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-vdb4v" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.717749 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.719080 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.724767 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-p2zqx" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.743654 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-665c5\" (UniqueName: \"kubernetes.io/projected/34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe-kube-api-access-665c5\") pod \"horizon-operator-controller-manager-68c6d99b8f-z6kf6\" (UID: \"34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.745024 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmb6n\" (UniqueName: \"kubernetes.io/projected/58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1-kube-api-access-jmb6n\") pod \"ironic-operator-controller-manager-6c548fd776-ldf84\" (UID: \"58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.745259 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert\") pod \"infra-operator-controller-manager-57548d458d-fk6f5\" (UID: \"a0857db7-00e4-410c-b5a2-945a46ae175a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.745467 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg4fd\" (UniqueName: \"kubernetes.io/projected/a0857db7-00e4-410c-b5a2-945a46ae175a-kube-api-access-cg4fd\") pod \"infra-operator-controller-manager-57548d458d-fk6f5\" (UID: 
\"a0857db7-00e4-410c-b5a2-945a46ae175a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.745698 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5vsc\" (UniqueName: \"kubernetes.io/projected/25849bc1-46e2-4ff1-a61a-f0b7105290bf-kube-api-access-j5vsc\") pod \"heat-operator-controller-manager-5f64f6f8bb-x9jlc\" (UID: \"25849bc1-46e2-4ff1-a61a-f0b7105290bf\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc" Dec 04 09:56:35 crc kubenswrapper[4776]: E1204 09:56:35.745789 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 09:56:35 crc kubenswrapper[4776]: E1204 09:56:35.745857 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert podName:a0857db7-00e4-410c-b5a2-945a46ae175a nodeName:}" failed. No retries permitted until 2025-12-04 09:56:36.245835168 +0000 UTC m=+1041.112315735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert") pod "infra-operator-controller-manager-57548d458d-fk6f5" (UID: "a0857db7-00e4-410c-b5a2-945a46ae175a") : secret "infra-operator-webhook-server-cert" not found Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.748122 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.749730 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.753896 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ltzrx" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.767905 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.779497 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.787317 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg4fd\" (UniqueName: \"kubernetes.io/projected/a0857db7-00e4-410c-b5a2-945a46ae175a-kube-api-access-cg4fd\") pod \"infra-operator-controller-manager-57548d458d-fk6f5\" (UID: \"a0857db7-00e4-410c-b5a2-945a46ae175a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.790867 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmb6n\" (UniqueName: \"kubernetes.io/projected/58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1-kube-api-access-jmb6n\") pod \"ironic-operator-controller-manager-6c548fd776-ldf84\" (UID: \"58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.794727 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5vsc\" (UniqueName: \"kubernetes.io/projected/25849bc1-46e2-4ff1-a61a-f0b7105290bf-kube-api-access-j5vsc\") pod \"heat-operator-controller-manager-5f64f6f8bb-x9jlc\" (UID: \"25849bc1-46e2-4ff1-a61a-f0b7105290bf\") " 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.798471 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-665c5\" (UniqueName: \"kubernetes.io/projected/34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe-kube-api-access-665c5\") pod \"horizon-operator-controller-manager-68c6d99b8f-z6kf6\" (UID: \"34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.819029 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.822371 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.826431 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-9s2bd" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.826580 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.832610 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.853259 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.855151 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.857329 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4xwh\" (UniqueName: \"kubernetes.io/projected/6171555b-a2ba-4177-b7d7-3bb5496a99bd-kube-api-access-w4xwh\") pod \"keystone-operator-controller-manager-7765d96ddf-zq7wg\" (UID: \"6171555b-a2ba-4177-b7d7-3bb5496a99bd\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.857382 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgfnt\" (UniqueName: \"kubernetes.io/projected/eca2af80-0e84-4615-9bd7-a907029259e7-kube-api-access-kgfnt\") pod \"manila-operator-controller-manager-79d898f8f7-lbtlb\" (UID: \"eca2af80-0e84-4615-9bd7-a907029259e7\") " pod="openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.872467 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-hcrmk" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.882228 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.892641 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.897286 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.900194 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wjbtw" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.910146 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.911320 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.917005 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.924310 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.926558 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.926764 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-6f96q" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.930591 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.932021 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.935874 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-q7x7d" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.952534 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.959528 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwstb\" (UniqueName: \"kubernetes.io/projected/23b5c3d3-b677-4440-b489-9e1811b722bb-kube-api-access-lwstb\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-4d8fg\" (UID: \"23b5c3d3-b677-4440-b489-9e1811b722bb\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.959577 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ft6k\" (UniqueName: \"kubernetes.io/projected/17848cf1-eceb-4e3e-9e39-40a7e4507d6b-kube-api-access-7ft6k\") pod \"nova-operator-controller-manager-697bc559fc-ft7rc\" (UID: \"17848cf1-eceb-4e3e-9e39-40a7e4507d6b\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.959619 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4xwh\" (UniqueName: \"kubernetes.io/projected/6171555b-a2ba-4177-b7d7-3bb5496a99bd-kube-api-access-w4xwh\") pod \"keystone-operator-controller-manager-7765d96ddf-zq7wg\" (UID: \"6171555b-a2ba-4177-b7d7-3bb5496a99bd\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.959682 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgfnt\" (UniqueName: \"kubernetes.io/projected/eca2af80-0e84-4615-9bd7-a907029259e7-kube-api-access-kgfnt\") pod \"manila-operator-controller-manager-79d898f8f7-lbtlb\" (UID: \"eca2af80-0e84-4615-9bd7-a907029259e7\") " pod="openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.959744 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lrd7\" (UniqueName: \"kubernetes.io/projected/fe5ac80c-367a-489b-901e-76d872a26e4b-kube-api-access-4lrd7\") pod \"mariadb-operator-controller-manager-56bbcc9d85-5l4h4\" (UID: \"fe5ac80c-367a-489b-901e-76d872a26e4b\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.982013 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.983315 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr" Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.992297 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc"] Dec 04 09:56:35 crc kubenswrapper[4776]: I1204 09:56:35.993717 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.049909 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.050044 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.072192 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.078322 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-kgkgg" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.083435 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-z8b6r" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.115862 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgfnt\" (UniqueName: \"kubernetes.io/projected/eca2af80-0e84-4615-9bd7-a907029259e7-kube-api-access-kgfnt\") pod \"manila-operator-controller-manager-79d898f8f7-lbtlb\" (UID: \"eca2af80-0e84-4615-9bd7-a907029259e7\") " pod="openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.126790 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4xwh\" (UniqueName: \"kubernetes.io/projected/6171555b-a2ba-4177-b7d7-3bb5496a99bd-kube-api-access-w4xwh\") pod \"keystone-operator-controller-manager-7765d96ddf-zq7wg\" (UID: \"6171555b-a2ba-4177-b7d7-3bb5496a99bd\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.128258 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.195348 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.234660 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.236269 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj86q\" (UniqueName: \"kubernetes.io/projected/ec5e5439-8cfc-4e75-9627-45e4999aacea-kube-api-access-qj86q\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj\" (UID: \"ec5e5439-8cfc-4e75-9627-45e4999aacea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.236344 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lrd7\" (UniqueName: \"kubernetes.io/projected/fe5ac80c-367a-489b-901e-76d872a26e4b-kube-api-access-4lrd7\") pod \"mariadb-operator-controller-manager-56bbcc9d85-5l4h4\" (UID: \"fe5ac80c-367a-489b-901e-76d872a26e4b\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.236371 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.236414 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj\" (UID: \"ec5e5439-8cfc-4e75-9627-45e4999aacea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.236447 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwstb\" (UniqueName: \"kubernetes.io/projected/23b5c3d3-b677-4440-b489-9e1811b722bb-kube-api-access-lwstb\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-4d8fg\" (UID: \"23b5c3d3-b677-4440-b489-9e1811b722bb\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.236482 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ft6k\" (UniqueName: \"kubernetes.io/projected/17848cf1-eceb-4e3e-9e39-40a7e4507d6b-kube-api-access-7ft6k\") pod \"nova-operator-controller-manager-697bc559fc-ft7rc\" (UID: \"17848cf1-eceb-4e3e-9e39-40a7e4507d6b\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.236503 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dkhv\" (UniqueName: \"kubernetes.io/projected/115873e4-456f-4d60-84f0-182f467cb8c0-kube-api-access-9dkhv\") pod \"octavia-operator-controller-manager-998648c74-mlnr6\" (UID: \"115873e4-456f-4d60-84f0-182f467cb8c0\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6" Dec 04 09:56:36 crc 
kubenswrapper[4776]: I1204 09:56:36.237252 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm9kn\" (UniqueName: \"kubernetes.io/projected/50a0ede3-8c98-47c6-945e-6aeefa27f86e-kube-api-access-xm9kn\") pod \"ovn-operator-controller-manager-b6456fdb6-hscd7\" (UID: \"50a0ede3-8c98-47c6-945e-6aeefa27f86e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.239905 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-srgh2" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.252341 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.264740 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwstb\" (UniqueName: \"kubernetes.io/projected/23b5c3d3-b677-4440-b489-9e1811b722bb-kube-api-access-lwstb\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-4d8fg\" (UID: \"23b5c3d3-b677-4440-b489-9e1811b722bb\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.272098 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.275543 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.282386 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-4wl7x" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.282658 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ft6k\" (UniqueName: \"kubernetes.io/projected/17848cf1-eceb-4e3e-9e39-40a7e4507d6b-kube-api-access-7ft6k\") pod \"nova-operator-controller-manager-697bc559fc-ft7rc\" (UID: \"17848cf1-eceb-4e3e-9e39-40a7e4507d6b\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.283630 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lrd7\" (UniqueName: \"kubernetes.io/projected/fe5ac80c-367a-489b-901e-76d872a26e4b-kube-api-access-4lrd7\") pod \"mariadb-operator-controller-manager-56bbcc9d85-5l4h4\" (UID: \"fe5ac80c-367a-489b-901e-76d872a26e4b\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.292240 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.311128 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.312110 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.313937 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-fv8pt" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.327229 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.338074 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj\" (UID: \"ec5e5439-8cfc-4e75-9627-45e4999aacea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.338128 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dkhv\" (UniqueName: \"kubernetes.io/projected/115873e4-456f-4d60-84f0-182f467cb8c0-kube-api-access-9dkhv\") pod \"octavia-operator-controller-manager-998648c74-mlnr6\" (UID: \"115873e4-456f-4d60-84f0-182f467cb8c0\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.338152 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm9kn\" (UniqueName: \"kubernetes.io/projected/50a0ede3-8c98-47c6-945e-6aeefa27f86e-kube-api-access-xm9kn\") pod \"ovn-operator-controller-manager-b6456fdb6-hscd7\" (UID: \"50a0ede3-8c98-47c6-945e-6aeefa27f86e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.338192 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert\") pod \"infra-operator-controller-manager-57548d458d-fk6f5\" (UID: \"a0857db7-00e4-410c-b5a2-945a46ae175a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.338243 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntjqf\" (UniqueName: \"kubernetes.io/projected/6bab5c22-f51d-4049-adb5-343a7195eeb7-kube-api-access-ntjqf\") pod \"watcher-operator-controller-manager-769dc69bc-4c2d9\" (UID: \"6bab5c22-f51d-4049-adb5-343a7195eeb7\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.338267 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xdxq\" (UniqueName: \"kubernetes.io/projected/725f674d-7785-4bb1-95d2-2a650b9f4df8-kube-api-access-9xdxq\") pod \"telemetry-operator-controller-manager-76cc84c6bb-mkn8n\" (UID: \"725f674d-7785-4bb1-95d2-2a650b9f4df8\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.338298 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj86q\" (UniqueName: \"kubernetes.io/projected/ec5e5439-8cfc-4e75-9627-45e4999aacea-kube-api-access-qj86q\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj\" (UID: \"ec5e5439-8cfc-4e75-9627-45e4999aacea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.338316 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtvtc\" (UniqueName: 
\"kubernetes.io/projected/f6f8f6ca-820b-41e8-af0a-aa6b439a3dad-kube-api-access-vtvtc\") pod \"test-operator-controller-manager-5854674fcc-wbcs6\" (UID: \"f6f8f6ca-820b-41e8-af0a-aa6b439a3dad\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.338341 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z4dv\" (UniqueName: \"kubernetes.io/projected/c0269b5f-db90-427e-933b-6221bcfbde9e-kube-api-access-9z4dv\") pod \"swift-operator-controller-manager-5f8c65bbfc-8wwhc\" (UID: \"c0269b5f-db90-427e-933b-6221bcfbde9e\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.338358 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7htf\" (UniqueName: \"kubernetes.io/projected/8f26eb91-a638-4ba9-9547-7bef2c5513c4-kube-api-access-x7htf\") pod \"placement-operator-controller-manager-78f8948974-pzjlr\" (UID: \"8f26eb91-a638-4ba9-9547-7bef2c5513c4\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr" Dec 04 09:56:36 crc kubenswrapper[4776]: E1204 09:56:36.338480 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 09:56:36 crc kubenswrapper[4776]: E1204 09:56:36.338517 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert podName:ec5e5439-8cfc-4e75-9627-45e4999aacea nodeName:}" failed. No retries permitted until 2025-12-04 09:56:36.838502897 +0000 UTC m=+1041.704983274 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" (UID: "ec5e5439-8cfc-4e75-9627-45e4999aacea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 09:56:36 crc kubenswrapper[4776]: E1204 09:56:36.338978 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 09:56:36 crc kubenswrapper[4776]: E1204 09:56:36.339004 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert podName:a0857db7-00e4-410c-b5a2-945a46ae175a nodeName:}" failed. No retries permitted until 2025-12-04 09:56:37.338995523 +0000 UTC m=+1042.205475900 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert") pod "infra-operator-controller-manager-57548d458d-fk6f5" (UID: "a0857db7-00e4-410c-b5a2-945a46ae175a") : secret "infra-operator-webhook-server-cert" not found Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.351265 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.352404 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.357865 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.358122 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.358730 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vst62" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.372365 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.391514 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.392177 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.396537 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dkhv\" (UniqueName: \"kubernetes.io/projected/115873e4-456f-4d60-84f0-182f467cb8c0-kube-api-access-9dkhv\") pod \"octavia-operator-controller-manager-998648c74-mlnr6\" (UID: \"115873e4-456f-4d60-84f0-182f467cb8c0\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.397487 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm9kn\" (UniqueName: 
\"kubernetes.io/projected/50a0ede3-8c98-47c6-945e-6aeefa27f86e-kube-api-access-xm9kn\") pod \"ovn-operator-controller-manager-b6456fdb6-hscd7\" (UID: \"50a0ede3-8c98-47c6-945e-6aeefa27f86e\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.403052 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj86q\" (UniqueName: \"kubernetes.io/projected/ec5e5439-8cfc-4e75-9627-45e4999aacea-kube-api-access-qj86q\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj\" (UID: \"ec5e5439-8cfc-4e75-9627-45e4999aacea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.403390 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.439108 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.439170 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjqf\" (UniqueName: \"kubernetes.io/projected/6bab5c22-f51d-4049-adb5-343a7195eeb7-kube-api-access-ntjqf\") pod \"watcher-operator-controller-manager-769dc69bc-4c2d9\" (UID: \"6bab5c22-f51d-4049-adb5-343a7195eeb7\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.439191 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9xdxq\" (UniqueName: \"kubernetes.io/projected/725f674d-7785-4bb1-95d2-2a650b9f4df8-kube-api-access-9xdxq\") pod \"telemetry-operator-controller-manager-76cc84c6bb-mkn8n\" (UID: \"725f674d-7785-4bb1-95d2-2a650b9f4df8\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.439209 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvtc\" (UniqueName: \"kubernetes.io/projected/f6f8f6ca-820b-41e8-af0a-aa6b439a3dad-kube-api-access-vtvtc\") pod \"test-operator-controller-manager-5854674fcc-wbcs6\" (UID: \"f6f8f6ca-820b-41e8-af0a-aa6b439a3dad\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.439234 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z4dv\" (UniqueName: \"kubernetes.io/projected/c0269b5f-db90-427e-933b-6221bcfbde9e-kube-api-access-9z4dv\") pod \"swift-operator-controller-manager-5f8c65bbfc-8wwhc\" (UID: \"c0269b5f-db90-427e-933b-6221bcfbde9e\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.439252 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7htf\" (UniqueName: \"kubernetes.io/projected/8f26eb91-a638-4ba9-9547-7bef2c5513c4-kube-api-access-x7htf\") pod \"placement-operator-controller-manager-78f8948974-pzjlr\" (UID: \"8f26eb91-a638-4ba9-9547-7bef2c5513c4\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.439274 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.439312 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g4pd\" (UniqueName: \"kubernetes.io/projected/72061fb8-5546-4ced-ba4a-f7faeeebec85-kube-api-access-9g4pd\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.442998 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s767z"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.443861 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s767z" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.446390 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.450162 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-f2884" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.461028 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s767z"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.461762 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntjqf\" (UniqueName: \"kubernetes.io/projected/6bab5c22-f51d-4049-adb5-343a7195eeb7-kube-api-access-ntjqf\") pod \"watcher-operator-controller-manager-769dc69bc-4c2d9\" (UID: \"6bab5c22-f51d-4049-adb5-343a7195eeb7\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.474663 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvtc\" (UniqueName: \"kubernetes.io/projected/f6f8f6ca-820b-41e8-af0a-aa6b439a3dad-kube-api-access-vtvtc\") pod \"test-operator-controller-manager-5854674fcc-wbcs6\" (UID: \"f6f8f6ca-820b-41e8-af0a-aa6b439a3dad\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.483703 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z4dv\" (UniqueName: \"kubernetes.io/projected/c0269b5f-db90-427e-933b-6221bcfbde9e-kube-api-access-9z4dv\") pod \"swift-operator-controller-manager-5f8c65bbfc-8wwhc\" (UID: \"c0269b5f-db90-427e-933b-6221bcfbde9e\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.491823 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9xdxq\" (UniqueName: \"kubernetes.io/projected/725f674d-7785-4bb1-95d2-2a650b9f4df8-kube-api-access-9xdxq\") pod \"telemetry-operator-controller-manager-76cc84c6bb-mkn8n\" (UID: \"725f674d-7785-4bb1-95d2-2a650b9f4df8\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.500593 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7htf\" (UniqueName: \"kubernetes.io/projected/8f26eb91-a638-4ba9-9547-7bef2c5513c4-kube-api-access-x7htf\") pod \"placement-operator-controller-manager-78f8948974-pzjlr\" (UID: \"8f26eb91-a638-4ba9-9547-7bef2c5513c4\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.510295 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.540117 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.540186 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g4pd\" (UniqueName: \"kubernetes.io/projected/72061fb8-5546-4ced-ba4a-f7faeeebec85-kube-api-access-9g4pd\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.540252 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.540272 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66m6t\" (UniqueName: \"kubernetes.io/projected/615d312b-bd1f-40c3-b499-a7c4ae351cd3-kube-api-access-66m6t\") pod \"rabbitmq-cluster-operator-manager-668c99d594-s767z\" (UID: \"615d312b-bd1f-40c3-b499-a7c4ae351cd3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s767z" Dec 04 09:56:36 crc kubenswrapper[4776]: E1204 09:56:36.540533 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 09:56:36 crc kubenswrapper[4776]: E1204 09:56:36.540570 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs podName:72061fb8-5546-4ced-ba4a-f7faeeebec85 nodeName:}" failed. No retries permitted until 2025-12-04 09:56:37.040555885 +0000 UTC m=+1041.907036262 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs") pod "openstack-operator-controller-manager-7b845677-nvxnd" (UID: "72061fb8-5546-4ced-ba4a-f7faeeebec85") : secret "metrics-server-cert" not found Dec 04 09:56:36 crc kubenswrapper[4776]: E1204 09:56:36.541486 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 09:56:36 crc kubenswrapper[4776]: E1204 09:56:36.541533 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs podName:72061fb8-5546-4ced-ba4a-f7faeeebec85 nodeName:}" failed. No retries permitted until 2025-12-04 09:56:37.041518235 +0000 UTC m=+1041.907998612 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs") pod "openstack-operator-controller-manager-7b845677-nvxnd" (UID: "72061fb8-5546-4ced-ba4a-f7faeeebec85") : secret "webhook-server-cert" not found Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.556951 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.564680 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.565450 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.590274 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.591703 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.601517 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g4pd\" (UniqueName: \"kubernetes.io/projected/72061fb8-5546-4ced-ba4a-f7faeeebec85-kube-api-access-9g4pd\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.605849 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.640614 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.642208 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66m6t\" (UniqueName: \"kubernetes.io/projected/615d312b-bd1f-40c3-b499-a7c4ae351cd3-kube-api-access-66m6t\") pod \"rabbitmq-cluster-operator-manager-668c99d594-s767z\" (UID: \"615d312b-bd1f-40c3-b499-a7c4ae351cd3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s767z" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.669709 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.680164 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66m6t\" (UniqueName: \"kubernetes.io/projected/615d312b-bd1f-40c3-b499-a7c4ae351cd3-kube-api-access-66m6t\") pod \"rabbitmq-cluster-operator-manager-668c99d594-s767z\" (UID: \"615d312b-bd1f-40c3-b499-a7c4ae351cd3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s767z" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.721132 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.755594 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.805742 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s767z" Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.848615 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.855833 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc"] Dec 04 09:56:36 crc kubenswrapper[4776]: I1204 09:56:36.869166 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj\" (UID: \"ec5e5439-8cfc-4e75-9627-45e4999aacea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:56:36 crc kubenswrapper[4776]: E1204 09:56:36.869337 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 09:56:36 crc kubenswrapper[4776]: E1204 09:56:36.869380 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert podName:ec5e5439-8cfc-4e75-9627-45e4999aacea nodeName:}" failed. No retries permitted until 2025-12-04 09:56:37.869366676 +0000 UTC m=+1042.735847053 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" (UID: "ec5e5439-8cfc-4e75-9627-45e4999aacea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 09:56:36 crc kubenswrapper[4776]: W1204 09:56:36.872469 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ceaf037_5fce_4ef5_b273_724eb446e0af.slice/crio-9f9b93665615ee77efb1e42eb2b81a7739fbe0686a50809a1d3f59f0b1000462 WatchSource:0}: Error finding container 9f9b93665615ee77efb1e42eb2b81a7739fbe0686a50809a1d3f59f0b1000462: Status 404 returned error can't find the container with id 9f9b93665615ee77efb1e42eb2b81a7739fbe0686a50809a1d3f59f0b1000462 Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.098120 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.098603 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:37 crc kubenswrapper[4776]: E1204 09:56:37.098783 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 09:56:37 crc kubenswrapper[4776]: E1204 09:56:37.098840 4776 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs podName:72061fb8-5546-4ced-ba4a-f7faeeebec85 nodeName:}" failed. No retries permitted until 2025-12-04 09:56:38.09882365 +0000 UTC m=+1042.965304027 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs") pod "openstack-operator-controller-manager-7b845677-nvxnd" (UID: "72061fb8-5546-4ced-ba4a-f7faeeebec85") : secret "webhook-server-cert" not found Dec 04 09:56:37 crc kubenswrapper[4776]: E1204 09:56:37.101271 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 09:56:37 crc kubenswrapper[4776]: E1204 09:56:37.103543 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs podName:72061fb8-5546-4ced-ba4a-f7faeeebec85 nodeName:}" failed. No retries permitted until 2025-12-04 09:56:38.103502856 +0000 UTC m=+1042.969983233 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs") pod "openstack-operator-controller-manager-7b845677-nvxnd" (UID: "72061fb8-5546-4ced-ba4a-f7faeeebec85") : secret "metrics-server-cert" not found Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.168239 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6"] Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.355745 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57" event={"ID":"df5a8995-658c-4525-93ac-604d3c2af213","Type":"ContainerStarted","Data":"ce23ca1f3e0fb72a126a9340e9bf97f8d7b224f5d4a05fd741192db0f8f38b42"} Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.375511 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq" event={"ID":"2ceaf037-5fce-4ef5-b273-724eb446e0af","Type":"ContainerStarted","Data":"9f9b93665615ee77efb1e42eb2b81a7739fbe0686a50809a1d3f59f0b1000462"} Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.391675 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7" event={"ID":"ec5df28d-5944-43f3-bf28-12e1062b1060","Type":"ContainerStarted","Data":"715e5d4a0ab934921a35b046c6b5dfa0ee0fa6222633ed5f1d8c85fd68ce7967"} Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.403152 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc" event={"ID":"25849bc1-46e2-4ff1-a61a-f0b7105290bf","Type":"ContainerStarted","Data":"c93d3677dfaaa9df44c137b69a47b0f2dd25902f8e61b02b19d43be418657f00"} Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.447498 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert\") pod \"infra-operator-controller-manager-57548d458d-fk6f5\" (UID: \"a0857db7-00e4-410c-b5a2-945a46ae175a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:56:37 crc kubenswrapper[4776]: E1204 09:56:37.447765 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 09:56:37 crc kubenswrapper[4776]: E1204 09:56:37.447843 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert podName:a0857db7-00e4-410c-b5a2-945a46ae175a nodeName:}" failed. No retries permitted until 2025-12-04 09:56:39.447815871 +0000 UTC m=+1044.314296248 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert") pod "infra-operator-controller-manager-57548d458d-fk6f5" (UID: "a0857db7-00e4-410c-b5a2-945a46ae175a") : secret "infra-operator-webhook-server-cert" not found Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.450773 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m" event={"ID":"61813ce8-b03b-473b-9606-22515ab1de03","Type":"ContainerStarted","Data":"51fe7dd57edaac3bf1f6cd25b51de2e1d6bdbad9cad75c08533ef0e1cf350d25"} Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.461953 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6" event={"ID":"34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe","Type":"ContainerStarted","Data":"c228997c8636a6d67562862c71c5c5c999e3b2b811c3c788c89ebc7ad5d3f537"} Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.693427 4776 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7"] Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.699697 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4"] Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.709970 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84"] Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.724188 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg"] Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.737391 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb"] Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.777038 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg"] Dec 04 09:56:37 crc kubenswrapper[4776]: I1204 09:56:37.871793 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj\" (UID: \"ec5e5439-8cfc-4e75-9627-45e4999aacea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:56:37 crc kubenswrapper[4776]: E1204 09:56:37.872059 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 09:56:37 crc kubenswrapper[4776]: E1204 09:56:37.872105 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert 
podName:ec5e5439-8cfc-4e75-9627-45e4999aacea nodeName:}" failed. No retries permitted until 2025-12-04 09:56:39.872091426 +0000 UTC m=+1044.738571803 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" (UID: "ec5e5439-8cfc-4e75-9627-45e4999aacea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.192167 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:38 crc kubenswrapper[4776]: E1204 09:56:38.193376 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.195630 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:38 crc kubenswrapper[4776]: E1204 09:56:38.195675 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs podName:72061fb8-5546-4ced-ba4a-f7faeeebec85 nodeName:}" failed. No retries permitted until 2025-12-04 09:56:40.195629691 +0000 UTC m=+1045.062110058 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs") pod "openstack-operator-controller-manager-7b845677-nvxnd" (UID: "72061fb8-5546-4ced-ba4a-f7faeeebec85") : secret "metrics-server-cert" not found Dec 04 09:56:38 crc kubenswrapper[4776]: E1204 09:56:38.195880 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 09:56:38 crc kubenswrapper[4776]: E1204 09:56:38.195947 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs podName:72061fb8-5546-4ced-ba4a-f7faeeebec85 nodeName:}" failed. No retries permitted until 2025-12-04 09:56:40.195932971 +0000 UTC m=+1045.062413348 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs") pod "openstack-operator-controller-manager-7b845677-nvxnd" (UID: "72061fb8-5546-4ced-ba4a-f7faeeebec85") : secret "webhook-server-cert" not found Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.472053 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84" event={"ID":"58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1","Type":"ContainerStarted","Data":"49ff05bdc6fda84e7e14cffd683b2078e95638f891d6567b07bac7d228f00c5b"} Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.473599 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg" event={"ID":"6171555b-a2ba-4177-b7d7-3bb5496a99bd","Type":"ContainerStarted","Data":"4c332b51800ced1a445dbb40cae427a1df4074a0910bb3bc57be77de887cce8b"} Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.475312 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7" event={"ID":"50a0ede3-8c98-47c6-945e-6aeefa27f86e","Type":"ContainerStarted","Data":"74fd64b1bb4ca1e311100a0420659033f0ac1daf736b852bc9729d0f9663f7ac"} Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.478829 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg" event={"ID":"23b5c3d3-b677-4440-b489-9e1811b722bb","Type":"ContainerStarted","Data":"212517ab39e7c082643f33917d0779d7418cca0f51fe211e80cac827681f8499"} Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.489012 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4" event={"ID":"fe5ac80c-367a-489b-901e-76d872a26e4b","Type":"ContainerStarted","Data":"af31b9b95a89faf7306392431be81f348a9dd811b9a0eebb04279f278c07be6e"} Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.508515 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6"] Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.520445 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc"] Dec 04 09:56:38 crc kubenswrapper[4776]: W1204 09:56:38.534578 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod115873e4_456f_4d60_84f0_182f467cb8c0.slice/crio-a410ef9eea610f047177d9eb42b1c7c70661bedf504086cf82101b823d2fb839 WatchSource:0}: Error finding container a410ef9eea610f047177d9eb42b1c7c70661bedf504086cf82101b823d2fb839: Status 404 returned error can't find the container with id a410ef9eea610f047177d9eb42b1c7c70661bedf504086cf82101b823d2fb839 Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.536558 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s767z"] Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.550377 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n"] Dec 04 09:56:38 crc kubenswrapper[4776]: W1204 09:56:38.555748 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17848cf1_eceb_4e3e_9e39_40a7e4507d6b.slice/crio-073e686036cfa248748c5d1dce7fab39e2002ed4de85f4d341e4e1e8a4724009 WatchSource:0}: Error finding container 073e686036cfa248748c5d1dce7fab39e2002ed4de85f4d341e4e1e8a4724009: Status 404 returned error can't find the container with id 073e686036cfa248748c5d1dce7fab39e2002ed4de85f4d341e4e1e8a4724009 Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.563801 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9"] Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.564306 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb" event={"ID":"eca2af80-0e84-4615-9bd7-a907029259e7","Type":"ContainerStarted","Data":"73813e2e4b5522dcd4c8e1764b4196b60250c868cfd3e2563e5e08e59debd45c"} Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.575991 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6"] Dec 04 09:56:38 crc kubenswrapper[4776]: W1204 09:56:38.590036 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bab5c22_f51d_4049_adb5_343a7195eeb7.slice/crio-725a5e7b0302a5ecc9b1846fc20bf3738c62f856d0f6aa7643d25f5430b76b60 WatchSource:0}: Error finding container 725a5e7b0302a5ecc9b1846fc20bf3738c62f856d0f6aa7643d25f5430b76b60: Status 404 returned error 
can't find the container with id 725a5e7b0302a5ecc9b1846fc20bf3738c62f856d0f6aa7643d25f5430b76b60 Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.597843 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr"] Dec 04 09:56:38 crc kubenswrapper[4776]: W1204 09:56:38.602119 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod615d312b_bd1f_40c3_b499_a7c4ae351cd3.slice/crio-dbfd433e9d511afcf13ffd001a6c66e37ef348b78e4d37a56002d9a7e5b2de48 WatchSource:0}: Error finding container dbfd433e9d511afcf13ffd001a6c66e37ef348b78e4d37a56002d9a7e5b2de48: Status 404 returned error can't find the container with id dbfd433e9d511afcf13ffd001a6c66e37ef348b78e4d37a56002d9a7e5b2de48 Dec 04 09:56:38 crc kubenswrapper[4776]: W1204 09:56:38.604736 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f8f6ca_820b_41e8_af0a_aa6b439a3dad.slice/crio-dba697a297ab6d49d56a7ffd6a4d1e89e581d00a8b625a82122153ea48ff2353 WatchSource:0}: Error finding container dba697a297ab6d49d56a7ffd6a4d1e89e581d00a8b625a82122153ea48ff2353: Status 404 returned error can't find the container with id dba697a297ab6d49d56a7ffd6a4d1e89e581d00a8b625a82122153ea48ff2353 Dec 04 09:56:38 crc kubenswrapper[4776]: I1204 09:56:38.638987 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc"] Dec 04 09:56:38 crc kubenswrapper[4776]: W1204 09:56:38.685250 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod725f674d_7785_4bb1_95d2_2a650b9f4df8.slice/crio-854ad61a506dc44143092fac52aa4e355542321c269c30a52c1f35cd5080849b WatchSource:0}: Error finding container 854ad61a506dc44143092fac52aa4e355542321c269c30a52c1f35cd5080849b: Status 404 
returned error can't find the container with id 854ad61a506dc44143092fac52aa4e355542321c269c30a52c1f35cd5080849b Dec 04 09:56:39 crc kubenswrapper[4776]: I1204 09:56:39.462433 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert\") pod \"infra-operator-controller-manager-57548d458d-fk6f5\" (UID: \"a0857db7-00e4-410c-b5a2-945a46ae175a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:56:39 crc kubenswrapper[4776]: E1204 09:56:39.462574 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 09:56:39 crc kubenswrapper[4776]: E1204 09:56:39.462615 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert podName:a0857db7-00e4-410c-b5a2-945a46ae175a nodeName:}" failed. No retries permitted until 2025-12-04 09:56:43.462600573 +0000 UTC m=+1048.329080950 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert") pod "infra-operator-controller-manager-57548d458d-fk6f5" (UID: "a0857db7-00e4-410c-b5a2-945a46ae175a") : secret "infra-operator-webhook-server-cert" not found Dec 04 09:56:39 crc kubenswrapper[4776]: I1204 09:56:39.684932 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s767z" event={"ID":"615d312b-bd1f-40c3-b499-a7c4ae351cd3","Type":"ContainerStarted","Data":"dbfd433e9d511afcf13ffd001a6c66e37ef348b78e4d37a56002d9a7e5b2de48"} Dec 04 09:56:39 crc kubenswrapper[4776]: I1204 09:56:39.688351 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc" event={"ID":"17848cf1-eceb-4e3e-9e39-40a7e4507d6b","Type":"ContainerStarted","Data":"073e686036cfa248748c5d1dce7fab39e2002ed4de85f4d341e4e1e8a4724009"} Dec 04 09:56:39 crc kubenswrapper[4776]: I1204 09:56:39.690557 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9" event={"ID":"6bab5c22-f51d-4049-adb5-343a7195eeb7","Type":"ContainerStarted","Data":"725a5e7b0302a5ecc9b1846fc20bf3738c62f856d0f6aa7643d25f5430b76b60"} Dec 04 09:56:39 crc kubenswrapper[4776]: I1204 09:56:39.697289 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n" event={"ID":"725f674d-7785-4bb1-95d2-2a650b9f4df8","Type":"ContainerStarted","Data":"854ad61a506dc44143092fac52aa4e355542321c269c30a52c1f35cd5080849b"} Dec 04 09:56:39 crc kubenswrapper[4776]: I1204 09:56:39.699636 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr" 
event={"ID":"8f26eb91-a638-4ba9-9547-7bef2c5513c4","Type":"ContainerStarted","Data":"38221252fcb87fce38ccd918839e7f00898d22cd230385b3215054da10e31be4"} Dec 04 09:56:39 crc kubenswrapper[4776]: I1204 09:56:39.709805 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6" event={"ID":"f6f8f6ca-820b-41e8-af0a-aa6b439a3dad","Type":"ContainerStarted","Data":"dba697a297ab6d49d56a7ffd6a4d1e89e581d00a8b625a82122153ea48ff2353"} Dec 04 09:56:39 crc kubenswrapper[4776]: I1204 09:56:39.712199 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc" event={"ID":"c0269b5f-db90-427e-933b-6221bcfbde9e","Type":"ContainerStarted","Data":"eaa22e42b188997e95dd5732d8f9e95af6246188ccd86bdaa34128c553d8b676"} Dec 04 09:56:39 crc kubenswrapper[4776]: I1204 09:56:39.714844 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6" event={"ID":"115873e4-456f-4d60-84f0-182f467cb8c0","Type":"ContainerStarted","Data":"a410ef9eea610f047177d9eb42b1c7c70661bedf504086cf82101b823d2fb839"} Dec 04 09:56:39 crc kubenswrapper[4776]: I1204 09:56:39.882238 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj\" (UID: \"ec5e5439-8cfc-4e75-9627-45e4999aacea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:56:39 crc kubenswrapper[4776]: E1204 09:56:39.882490 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 09:56:39 crc kubenswrapper[4776]: E1204 09:56:39.882547 4776 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert podName:ec5e5439-8cfc-4e75-9627-45e4999aacea nodeName:}" failed. No retries permitted until 2025-12-04 09:56:43.882529853 +0000 UTC m=+1048.749010230 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" (UID: "ec5e5439-8cfc-4e75-9627-45e4999aacea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 09:56:40 crc kubenswrapper[4776]: I1204 09:56:40.286222 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:40 crc kubenswrapper[4776]: I1204 09:56:40.286326 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:40 crc kubenswrapper[4776]: E1204 09:56:40.286463 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 09:56:40 crc kubenswrapper[4776]: E1204 09:56:40.286514 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs podName:72061fb8-5546-4ced-ba4a-f7faeeebec85 nodeName:}" failed. 
No retries permitted until 2025-12-04 09:56:44.286498903 +0000 UTC m=+1049.152979280 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs") pod "openstack-operator-controller-manager-7b845677-nvxnd" (UID: "72061fb8-5546-4ced-ba4a-f7faeeebec85") : secret "webhook-server-cert" not found Dec 04 09:56:40 crc kubenswrapper[4776]: E1204 09:56:40.287399 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 09:56:40 crc kubenswrapper[4776]: E1204 09:56:40.287435 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs podName:72061fb8-5546-4ced-ba4a-f7faeeebec85 nodeName:}" failed. No retries permitted until 2025-12-04 09:56:44.287427043 +0000 UTC m=+1049.153907420 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs") pod "openstack-operator-controller-manager-7b845677-nvxnd" (UID: "72061fb8-5546-4ced-ba4a-f7faeeebec85") : secret "metrics-server-cert" not found Dec 04 09:56:43 crc kubenswrapper[4776]: I1204 09:56:43.497456 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert\") pod \"infra-operator-controller-manager-57548d458d-fk6f5\" (UID: \"a0857db7-00e4-410c-b5a2-945a46ae175a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:56:43 crc kubenswrapper[4776]: E1204 09:56:43.497939 4776 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 09:56:43 crc kubenswrapper[4776]: E1204 09:56:43.498453 4776 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert podName:a0857db7-00e4-410c-b5a2-945a46ae175a nodeName:}" failed. No retries permitted until 2025-12-04 09:56:51.498423166 +0000 UTC m=+1056.364903543 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert") pod "infra-operator-controller-manager-57548d458d-fk6f5" (UID: "a0857db7-00e4-410c-b5a2-945a46ae175a") : secret "infra-operator-webhook-server-cert" not found Dec 04 09:56:43 crc kubenswrapper[4776]: I1204 09:56:43.904044 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj\" (UID: \"ec5e5439-8cfc-4e75-9627-45e4999aacea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:56:43 crc kubenswrapper[4776]: E1204 09:56:43.904281 4776 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 09:56:43 crc kubenswrapper[4776]: E1204 09:56:43.904330 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert podName:ec5e5439-8cfc-4e75-9627-45e4999aacea nodeName:}" failed. No retries permitted until 2025-12-04 09:56:51.904315716 +0000 UTC m=+1056.770796093 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" (UID: "ec5e5439-8cfc-4e75-9627-45e4999aacea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 09:56:44 crc kubenswrapper[4776]: I1204 09:56:44.310277 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:44 crc kubenswrapper[4776]: I1204 09:56:44.310443 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:44 crc kubenswrapper[4776]: E1204 09:56:44.310500 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 09:56:44 crc kubenswrapper[4776]: E1204 09:56:44.310617 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 09:56:44 crc kubenswrapper[4776]: E1204 09:56:44.310631 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs podName:72061fb8-5546-4ced-ba4a-f7faeeebec85 nodeName:}" failed. No retries permitted until 2025-12-04 09:56:52.310604499 +0000 UTC m=+1057.177084876 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs") pod "openstack-operator-controller-manager-7b845677-nvxnd" (UID: "72061fb8-5546-4ced-ba4a-f7faeeebec85") : secret "metrics-server-cert" not found Dec 04 09:56:44 crc kubenswrapper[4776]: E1204 09:56:44.310704 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs podName:72061fb8-5546-4ced-ba4a-f7faeeebec85 nodeName:}" failed. No retries permitted until 2025-12-04 09:56:52.310683542 +0000 UTC m=+1057.177163929 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs") pod "openstack-operator-controller-manager-7b845677-nvxnd" (UID: "72061fb8-5546-4ced-ba4a-f7faeeebec85") : secret "webhook-server-cert" not found Dec 04 09:56:51 crc kubenswrapper[4776]: I1204 09:56:51.590647 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert\") pod \"infra-operator-controller-manager-57548d458d-fk6f5\" (UID: \"a0857db7-00e4-410c-b5a2-945a46ae175a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:56:51 crc kubenswrapper[4776]: I1204 09:56:51.598243 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0857db7-00e4-410c-b5a2-945a46ae175a-cert\") pod \"infra-operator-controller-manager-57548d458d-fk6f5\" (UID: \"a0857db7-00e4-410c-b5a2-945a46ae175a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:56:51 crc kubenswrapper[4776]: I1204 09:56:51.843246 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:56:51 crc kubenswrapper[4776]: I1204 09:56:51.995255 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj\" (UID: \"ec5e5439-8cfc-4e75-9627-45e4999aacea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:56:52 crc kubenswrapper[4776]: I1204 09:56:52.001958 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5e5439-8cfc-4e75-9627-45e4999aacea-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj\" (UID: \"ec5e5439-8cfc-4e75-9627-45e4999aacea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:56:52 crc kubenswrapper[4776]: I1204 09:56:52.296045 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:56:52 crc kubenswrapper[4776]: I1204 09:56:52.401365 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:52 crc kubenswrapper[4776]: I1204 09:56:52.401473 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:56:52 crc kubenswrapper[4776]: E1204 09:56:52.401571 4776 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 09:56:52 crc kubenswrapper[4776]: E1204 09:56:52.401667 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs podName:72061fb8-5546-4ced-ba4a-f7faeeebec85 nodeName:}" failed. No retries permitted until 2025-12-04 09:57:08.40164477 +0000 UTC m=+1073.268125217 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs") pod "openstack-operator-controller-manager-7b845677-nvxnd" (UID: "72061fb8-5546-4ced-ba4a-f7faeeebec85") : secret "metrics-server-cert" not found Dec 04 09:56:52 crc kubenswrapper[4776]: E1204 09:56:52.401594 4776 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 09:56:52 crc kubenswrapper[4776]: E1204 09:56:52.402079 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs podName:72061fb8-5546-4ced-ba4a-f7faeeebec85 nodeName:}" failed. No retries permitted until 2025-12-04 09:57:08.402068422 +0000 UTC m=+1073.268548799 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs") pod "openstack-operator-controller-manager-7b845677-nvxnd" (UID: "72061fb8-5546-4ced-ba4a-f7faeeebec85") : secret "webhook-server-cert" not found Dec 04 09:56:55 crc kubenswrapper[4776]: E1204 09:56:55.475143 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Dec 04 09:56:55 crc kubenswrapper[4776]: E1204 09:56:55.475486 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zdjhv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-z8q57_openstack-operators(df5a8995-658c-4525-93ac-604d3c2af213): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:56:56 crc kubenswrapper[4776]: E1204 09:56:56.314249 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 04 09:56:56 crc kubenswrapper[4776]: E1204 09:56:56.314549 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jmb6n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-ldf84_openstack-operators(58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:56:56 crc kubenswrapper[4776]: E1204 09:56:56.892298 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 04 09:56:56 crc kubenswrapper[4776]: E1204 09:56:56.892510 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j5vsc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-x9jlc_openstack-operators(25849bc1-46e2-4ff1-a61a-f0b7105290bf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:56:57 crc kubenswrapper[4776]: E1204 09:56:57.641961 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 04 09:56:57 crc kubenswrapper[4776]: E1204 09:56:57.642195 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9dkhv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-mlnr6_openstack-operators(115873e4-456f-4d60-84f0-182f467cb8c0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:56:59 crc kubenswrapper[4776]: E1204 09:56:59.333255 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 04 09:56:59 crc kubenswrapper[4776]: E1204 09:56:59.333498 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xm9kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-hscd7_openstack-operators(50a0ede3-8c98-47c6-945e-6aeefa27f86e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:56:59 crc kubenswrapper[4776]: E1204 09:56:59.909296 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 04 09:56:59 crc kubenswrapper[4776]: E1204 09:56:59.909502 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4lrd7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-5l4h4_openstack-operators(fe5ac80c-367a-489b-901e-76d872a26e4b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:57:00 crc kubenswrapper[4776]: E1204 09:57:00.444555 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d" Dec 04 09:57:00 crc kubenswrapper[4776]: E1204 09:57:00.445114 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9z4dv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-8wwhc_openstack-operators(c0269b5f-db90-427e-933b-6221bcfbde9e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:57:01 crc kubenswrapper[4776]: E1204 09:57:01.034871 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 04 09:57:01 crc kubenswrapper[4776]: E1204 09:57:01.035135 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-665c5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-z6kf6_openstack-operators(34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:57:01 crc kubenswrapper[4776]: E1204 09:57:01.642602 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385" Dec 04 09:57:01 crc kubenswrapper[4776]: E1204 09:57:01.642835 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9xdxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-mkn8n_openstack-operators(725f674d-7785-4bb1-95d2-2a650b9f4df8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:57:02 crc kubenswrapper[4776]: E1204 09:57:02.980201 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809" Dec 04 09:57:02 crc kubenswrapper[4776]: E1204 09:57:02.980429 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pn7f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-d7zhq_openstack-operators(2ceaf037-5fce-4ef5-b273-724eb446e0af): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:57:03 crc kubenswrapper[4776]: E1204 09:57:03.341643 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 04 09:57:03 crc kubenswrapper[4776]: E1204 09:57:03.341902 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lwstb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-4d8fg_openstack-operators(23b5c3d3-b677-4440-b489-9e1811b722bb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:57:04 crc kubenswrapper[4776]: E1204 09:57:04.937624 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 04 09:57:04 crc kubenswrapper[4776]: E1204 09:57:04.938315 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vtvtc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-wbcs6_openstack-operators(f6f8f6ca-820b-41e8-af0a-aa6b439a3dad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:57:05 crc kubenswrapper[4776]: E1204 09:57:05.565496 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 04 09:57:05 crc kubenswrapper[4776]: E1204 09:57:05.565679 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-66m6t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-s767z_openstack-operators(615d312b-bd1f-40c3-b499-a7c4ae351cd3): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:57:05 crc kubenswrapper[4776]: E1204 09:57:05.566823 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s767z" podUID="615d312b-bd1f-40c3-b499-a7c4ae351cd3" Dec 04 09:57:06 crc kubenswrapper[4776]: E1204 09:57:06.026932 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s767z" podUID="615d312b-bd1f-40c3-b499-a7c4ae351cd3" Dec 04 09:57:06 crc kubenswrapper[4776]: E1204 09:57:06.174857 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 04 09:57:06 crc kubenswrapper[4776]: E1204 09:57:06.175083 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7ft6k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-ft7rc_openstack-operators(17848cf1-eceb-4e3e-9e39-40a7e4507d6b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:57:06 crc kubenswrapper[4776]: E1204 09:57:06.259623 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.73:5001/openstack-k8s-operators/manila-operator:52249811d86b52a6ea89712db017ab950bb46e03" Dec 04 09:57:06 crc kubenswrapper[4776]: E1204 09:57:06.259697 4776 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.73:5001/openstack-k8s-operators/manila-operator:52249811d86b52a6ea89712db017ab950bb46e03" Dec 04 09:57:06 crc kubenswrapper[4776]: E1204 09:57:06.259858 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.73:5001/openstack-k8s-operators/manila-operator:52249811d86b52a6ea89712db017ab950bb46e03,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kgfnt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-79d898f8f7-lbtlb_openstack-operators(eca2af80-0e84-4615-9bd7-a907029259e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:57:06 crc kubenswrapper[4776]: I1204 09:57:06.565235 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj"] Dec 04 09:57:06 crc kubenswrapper[4776]: I1204 09:57:06.943629 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5"] Dec 04 09:57:07 crc kubenswrapper[4776]: I1204 09:57:07.036846 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7" event={"ID":"ec5df28d-5944-43f3-bf28-12e1062b1060","Type":"ContainerStarted","Data":"7684a2863cb431cb8e2af2c9dd0c9f1c59da338406250d957ccc42c4a17dcb00"} Dec 04 09:57:07 crc kubenswrapper[4776]: I1204 09:57:07.039077 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9" 
event={"ID":"6bab5c22-f51d-4049-adb5-343a7195eeb7","Type":"ContainerStarted","Data":"2e31ec09988c17946e782d32829dbf085e5cdbcc5f44ca7629121525a3a64e70"} Dec 04 09:57:07 crc kubenswrapper[4776]: I1204 09:57:07.041027 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr" event={"ID":"8f26eb91-a638-4ba9-9547-7bef2c5513c4","Type":"ContainerStarted","Data":"ffe03b005996ddb7a3a4ed61d661429defdb04cdb1faf5182088145c36c7772a"} Dec 04 09:57:07 crc kubenswrapper[4776]: I1204 09:57:07.042410 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m" event={"ID":"61813ce8-b03b-473b-9606-22515ab1de03","Type":"ContainerStarted","Data":"18c20741222f2a69bf90070b54b4db55282ee2057e22b478ee215df61ef8f441"} Dec 04 09:57:07 crc kubenswrapper[4776]: I1204 09:57:07.046997 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" event={"ID":"ec5e5439-8cfc-4e75-9627-45e4999aacea","Type":"ContainerStarted","Data":"6946b0e641b73154b41629f5531a66169c1219778627e51d9c5a39fc4a71969c"} Dec 04 09:57:07 crc kubenswrapper[4776]: W1204 09:57:07.426473 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0857db7_00e4_410c_b5a2_945a46ae175a.slice/crio-25a4c9bdea890b039a56bd72c3910d8d753c36f3f10b132f52230bdde0b28243 WatchSource:0}: Error finding container 25a4c9bdea890b039a56bd72c3910d8d753c36f3f10b132f52230bdde0b28243: Status 404 returned error can't find the container with id 25a4c9bdea890b039a56bd72c3910d8d753c36f3f10b132f52230bdde0b28243 Dec 04 09:57:08 crc kubenswrapper[4776]: I1204 09:57:08.066672 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" 
event={"ID":"a0857db7-00e4-410c-b5a2-945a46ae175a","Type":"ContainerStarted","Data":"25a4c9bdea890b039a56bd72c3910d8d753c36f3f10b132f52230bdde0b28243"} Dec 04 09:57:08 crc kubenswrapper[4776]: I1204 09:57:08.069081 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg" event={"ID":"6171555b-a2ba-4177-b7d7-3bb5496a99bd","Type":"ContainerStarted","Data":"59311630791bcf547305af48b547747fdf2328d325839d00db6b6efd0360f6ce"} Dec 04 09:57:08 crc kubenswrapper[4776]: I1204 09:57:08.485812 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:57:08 crc kubenswrapper[4776]: I1204 09:57:08.486208 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:57:08 crc kubenswrapper[4776]: I1204 09:57:08.491319 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-metrics-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:57:08 crc kubenswrapper[4776]: I1204 09:57:08.492550 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/72061fb8-5546-4ced-ba4a-f7faeeebec85-webhook-certs\") pod \"openstack-operator-controller-manager-7b845677-nvxnd\" (UID: \"72061fb8-5546-4ced-ba4a-f7faeeebec85\") " pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:57:08 crc kubenswrapper[4776]: I1204 09:57:08.776701 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:57:13 crc kubenswrapper[4776]: I1204 09:57:13.219588 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd"] Dec 04 09:57:14 crc kubenswrapper[4776]: E1204 09:57:14.124195 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc" podUID="25849bc1-46e2-4ff1-a61a-f0b7105290bf" Dec 04 09:57:14 crc kubenswrapper[4776]: E1204 09:57:14.124944 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84" podUID="58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1" Dec 04 09:57:14 crc kubenswrapper[4776]: E1204 09:57:14.125896 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n" podUID="725f674d-7785-4bb1-95d2-2a650b9f4df8" Dec 04 09:57:14 crc kubenswrapper[4776]: I1204 09:57:14.127170 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" event={"ID":"72061fb8-5546-4ced-ba4a-f7faeeebec85","Type":"ContainerStarted","Data":"cd8fdd92873260aae047db82b1179f052d1b66fcf94ffb93a41332198f2abe18"} Dec 04 09:57:14 crc kubenswrapper[4776]: E1204 09:57:14.361720 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57" podUID="df5a8995-658c-4525-93ac-604d3c2af213" Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.135584 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc" event={"ID":"25849bc1-46e2-4ff1-a61a-f0b7105290bf","Type":"ContainerStarted","Data":"8d2e7ecc916062c5608890a3588eaf406400ce92659bb0a3c28031f75c054487"} Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.137882 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" event={"ID":"72061fb8-5546-4ced-ba4a-f7faeeebec85","Type":"ContainerStarted","Data":"88e520fde281b942ef62650f2993dbcc087b4410d14eb3637ae5325e7596eb17"} Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.139467 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57" event={"ID":"df5a8995-658c-4525-93ac-604d3c2af213","Type":"ContainerStarted","Data":"32093763d4cd6c30af81ec5cc7e4bf45dffe14bbcd683fffb1974994bc12140c"} Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.141300 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" 
event={"ID":"ec5e5439-8cfc-4e75-9627-45e4999aacea","Type":"ContainerStarted","Data":"f05be484d83a9f38e083873b2004b0d8e463cd087aec5b941faca595de0d2abe"} Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.143593 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7" event={"ID":"ec5df28d-5944-43f3-bf28-12e1062b1060","Type":"ContainerStarted","Data":"58a9acefaa6ff5b02b5ff7f246bb63aa86c5270039cc42882976e3ddba479902"} Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.143672 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7" Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.145040 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n" event={"ID":"725f674d-7785-4bb1-95d2-2a650b9f4df8","Type":"ContainerStarted","Data":"9a873ec43fdac1a1d42fd56997cf067ded2997cc51c73d8b54e92d19cb252094"} Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.146485 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7" Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.146575 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84" event={"ID":"58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1","Type":"ContainerStarted","Data":"3bacdc378fccbc85d8df99278edd4c55877b31e36484ad67579803f1917f731b"} Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.148197 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr" event={"ID":"8f26eb91-a638-4ba9-9547-7bef2c5513c4","Type":"ContainerStarted","Data":"27f7a1b73bfc9bb219f2c01bd1a63154ae5606c63ae7e47a17440888cb6f62b5"} Dec 04 09:57:15 crc 
kubenswrapper[4776]: I1204 09:57:15.148449 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr" Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.159987 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr" Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.198728 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" podStartSLOduration=39.19869067 podStartE2EDuration="39.19869067s" podCreationTimestamp="2025-12-04 09:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:57:15.193455435 +0000 UTC m=+1080.059935812" watchObservedRunningTime="2025-12-04 09:57:15.19869067 +0000 UTC m=+1080.065171047" Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.223893 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pzjlr" podStartSLOduration=4.843903436 podStartE2EDuration="40.223870417s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:38.725381425 +0000 UTC m=+1043.591861802" lastFinishedPulling="2025-12-04 09:57:14.105348406 +0000 UTC m=+1078.971828783" observedRunningTime="2025-12-04 09:57:15.217344383 +0000 UTC m=+1080.083824760" watchObservedRunningTime="2025-12-04 09:57:15.223870417 +0000 UTC m=+1080.090350794" Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.225705 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 09:57:15 crc kubenswrapper[4776]: I1204 09:57:15.259147 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jtqt7" podStartSLOduration=2.650022803 podStartE2EDuration="40.259123589s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:36.556087331 +0000 UTC m=+1041.422567708" lastFinishedPulling="2025-12-04 09:57:14.165188117 +0000 UTC m=+1079.031668494" observedRunningTime="2025-12-04 09:57:15.253963418 +0000 UTC m=+1080.120443795" watchObservedRunningTime="2025-12-04 09:57:15.259123589 +0000 UTC m=+1080.125603966" Dec 04 09:57:16 crc kubenswrapper[4776]: I1204 09:57:16.155520 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:57:19 crc kubenswrapper[4776]: I1204 09:57:19.380298 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:57:19 crc kubenswrapper[4776]: I1204 09:57:19.380653 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:57:28 crc kubenswrapper[4776]: I1204 09:57:28.783463 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7b845677-nvxnd" Dec 04 09:57:28 crc kubenswrapper[4776]: E1204 09:57:28.901172 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7" Dec 04 09:57:28 crc kubenswrapper[4776]: E1204 09:57:28.901377 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cg4fd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-57548d458d-fk6f5_openstack-operators(a0857db7-00e4-410c-b5a2-945a46ae175a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:57:29 crc kubenswrapper[4776]: E1204 09:57:29.641860 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4" podUID="fe5ac80c-367a-489b-901e-76d872a26e4b" Dec 04 09:57:29 crc kubenswrapper[4776]: E1204 09:57:29.642777 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc" podUID="17848cf1-eceb-4e3e-9e39-40a7e4507d6b" Dec 04 09:57:29 crc kubenswrapper[4776]: E1204 09:57:29.657397 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg" podUID="23b5c3d3-b677-4440-b489-9e1811b722bb" Dec 04 09:57:29 crc kubenswrapper[4776]: E1204 09:57:29.698826 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6" podUID="34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe" Dec 04 09:57:29 crc kubenswrapper[4776]: E1204 09:57:29.850041 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq" podUID="2ceaf037-5fce-4ef5-b273-724eb446e0af" Dec 04 09:57:29 crc kubenswrapper[4776]: E1204 09:57:29.850089 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6" podUID="f6f8f6ca-820b-41e8-af0a-aa6b439a3dad" Dec 04 09:57:29 crc kubenswrapper[4776]: E1204 09:57:29.874468 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb" podUID="eca2af80-0e84-4615-9bd7-a907029259e7" Dec 04 09:57:29 crc kubenswrapper[4776]: E1204 09:57:29.878819 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" podUID="a0857db7-00e4-410c-b5a2-945a46ae175a" Dec 04 09:57:29 crc kubenswrapper[4776]: E1204 09:57:29.944631 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc" podUID="c0269b5f-db90-427e-933b-6221bcfbde9e" Dec 04 09:57:29 crc kubenswrapper[4776]: E1204 09:57:29.992344 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6" podUID="115873e4-456f-4d60-84f0-182f467cb8c0" Dec 04 09:57:29 crc kubenswrapper[4776]: E1204 09:57:29.994135 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7" podUID="50a0ede3-8c98-47c6-945e-6aeefa27f86e" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.259040 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc" event={"ID":"25849bc1-46e2-4ff1-a61a-f0b7105290bf","Type":"ContainerStarted","Data":"ed064668c34498cbbfd8d07810480f54e6709e3d667b96f3b3e28a8d2690759f"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.259206 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.261144 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s767z" event={"ID":"615d312b-bd1f-40c3-b499-a7c4ae351cd3","Type":"ContainerStarted","Data":"02f0baecda629de497da8d1c513028987b569a7f49d60f5cb632dd4e61a2f3f5"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.263686 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m" event={"ID":"61813ce8-b03b-473b-9606-22515ab1de03","Type":"ContainerStarted","Data":"f245dbe5103fae0932335dd87c2c560824e10d1fa7baebf2339de82e43197304"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.263886 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.265869 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg" event={"ID":"23b5c3d3-b677-4440-b489-9e1811b722bb","Type":"ContainerStarted","Data":"622be12acab804bd0a64aa3d8d47c8e2a25ae2fa85c649338b6adc608389e0c5"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.271346 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.271702 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg" event={"ID":"6171555b-a2ba-4177-b7d7-3bb5496a99bd","Type":"ContainerStarted","Data":"6e8351d202a5d4e44f813d7b2f4e57850e1fd3fc826855ec506a6bdb4b8581cd"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.272249 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.274899 4776 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6" event={"ID":"115873e4-456f-4d60-84f0-182f467cb8c0","Type":"ContainerStarted","Data":"51b663720045008fb3eb42e752868e70dd83f9639a4a0674c17ec82c7026cc7a"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.277372 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" event={"ID":"a0857db7-00e4-410c-b5a2-945a46ae175a","Type":"ContainerStarted","Data":"5d78c01079a4c674bce3e2080b9a5cb67a64cbafbd1e9c4c41b053cc6858740a"} Dec 04 09:57:30 crc kubenswrapper[4776]: E1204 09:57:30.278678 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7\\\"\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" podUID="a0857db7-00e4-410c-b5a2-945a46ae175a" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.279485 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc" event={"ID":"17848cf1-eceb-4e3e-9e39-40a7e4507d6b","Type":"ContainerStarted","Data":"b19a4ca34185e3962caf79aa36fc7b56433691a3161f8ee4539e7114cc205d88"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.281270 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6" event={"ID":"f6f8f6ca-820b-41e8-af0a-aa6b439a3dad","Type":"ContainerStarted","Data":"266b2ff7d6ddb0aeeb468b6f4f29b4d707a67de4f6a37ac4d614e9f62b7384b8"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.287631 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg" Dec 04 
09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.287672 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc" event={"ID":"c0269b5f-db90-427e-933b-6221bcfbde9e","Type":"ContainerStarted","Data":"ca601ac8b15fff8704608310f19c3ea2a1ef208ae196967c4129646f67cfeaed"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.291810 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6" event={"ID":"34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe","Type":"ContainerStarted","Data":"e0dee69193090204267e8fe4ed76d7f7dd3e26093220e4a387558b1291ecf5d5"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.297533 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4" event={"ID":"fe5ac80c-367a-489b-901e-76d872a26e4b","Type":"ContainerStarted","Data":"54bbf58c78d7ad1fa4ed4f039fafdb28116070eb2396881e7a91605ea0116931"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.300167 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb" event={"ID":"eca2af80-0e84-4615-9bd7-a907029259e7","Type":"ContainerStarted","Data":"cb562084234ace72f4ed96770ca0a785c5accb5443fdb5880cad3c6bd9b39103"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.306173 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc" podStartSLOduration=3.097937138 podStartE2EDuration="55.30614378s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:36.941943865 +0000 UTC m=+1041.808424242" lastFinishedPulling="2025-12-04 09:57:29.150150507 +0000 UTC m=+1094.016630884" observedRunningTime="2025-12-04 09:57:30.301440113 +0000 UTC m=+1095.167920490" watchObservedRunningTime="2025-12-04 
09:57:30.30614378 +0000 UTC m=+1095.172624157" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.321532 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq" event={"ID":"2ceaf037-5fce-4ef5-b273-724eb446e0af","Type":"ContainerStarted","Data":"cf6bdd4a7f939db999c6c15f7bcc9ef054669731ca47532f8ac035c3ad58c485"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.344295 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" event={"ID":"ec5e5439-8cfc-4e75-9627-45e4999aacea","Type":"ContainerStarted","Data":"3bdc9e867314017e8fe520233f37553b4561b0c7de4983cf2b264be207ee4356"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.345370 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.351256 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9" event={"ID":"6bab5c22-f51d-4049-adb5-343a7195eeb7","Type":"ContainerStarted","Data":"a7d9a41e71d4a92942a54529c4a82a641af90e41c1cf1b9d933f53b21285edb1"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.352312 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.357940 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.358225 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9" Dec 04 09:57:30 crc 
kubenswrapper[4776]: I1204 09:57:30.370409 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n" event={"ID":"725f674d-7785-4bb1-95d2-2a650b9f4df8","Type":"ContainerStarted","Data":"b6546e836d780a16d6d2803d1c5ec765a08ce636bdd31e213d3d8ab5652cbdbd"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.371266 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.386269 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7" event={"ID":"50a0ede3-8c98-47c6-945e-6aeefa27f86e","Type":"ContainerStarted","Data":"39f4c5c80ee1b6064711f25b2390bb005f433762a0880ebd82727c6ffa201c4d"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.395892 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57" event={"ID":"df5a8995-658c-4525-93ac-604d3c2af213","Type":"ContainerStarted","Data":"da12773b0067aa258f91a72f18d11fa63aeadf436039a05cb3ddeb58f8a3a287"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.395955 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.421395 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84" event={"ID":"58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1","Type":"ContainerStarted","Data":"5a3ffbf8fc97e678440afec0ffd415bc2cc311065dc717adf274404834f1bf2f"} Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.422332 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.541574 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-8h27m" podStartSLOduration=2.946289695 podStartE2EDuration="55.54155394s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:36.55639391 +0000 UTC m=+1041.422874287" lastFinishedPulling="2025-12-04 09:57:29.151658155 +0000 UTC m=+1094.018138532" observedRunningTime="2025-12-04 09:57:30.538305368 +0000 UTC m=+1095.404785745" watchObservedRunningTime="2025-12-04 09:57:30.54155394 +0000 UTC m=+1095.408034317" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.671836 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s767z" podStartSLOduration=4.12996824 podStartE2EDuration="54.671809193s" podCreationTimestamp="2025-12-04 09:56:36 +0000 UTC" firstStartedPulling="2025-12-04 09:56:38.606295431 +0000 UTC m=+1043.472775808" lastFinishedPulling="2025-12-04 09:57:29.148136384 +0000 UTC m=+1094.014616761" observedRunningTime="2025-12-04 09:57:30.665514076 +0000 UTC m=+1095.531994463" watchObservedRunningTime="2025-12-04 09:57:30.671809193 +0000 UTC m=+1095.538289580" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.760950 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-zq7wg" podStartSLOduration=4.516506749 podStartE2EDuration="55.76093054s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:37.884050539 +0000 UTC m=+1042.750530916" lastFinishedPulling="2025-12-04 09:57:29.12847433 +0000 UTC m=+1093.994954707" observedRunningTime="2025-12-04 09:57:30.75743062 +0000 UTC m=+1095.623910997" 
watchObservedRunningTime="2025-12-04 09:57:30.76093054 +0000 UTC m=+1095.627410917" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.832547 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n" podStartSLOduration=5.421326889 podStartE2EDuration="55.832518767s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:38.741066375 +0000 UTC m=+1043.607546752" lastFinishedPulling="2025-12-04 09:57:29.152258253 +0000 UTC m=+1094.018738630" observedRunningTime="2025-12-04 09:57:30.825899381 +0000 UTC m=+1095.692379768" watchObservedRunningTime="2025-12-04 09:57:30.832518767 +0000 UTC m=+1095.698999144" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.863831 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57" podStartSLOduration=3.663881332 podStartE2EDuration="55.863810156s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:36.942212903 +0000 UTC m=+1041.808693280" lastFinishedPulling="2025-12-04 09:57:29.142141727 +0000 UTC m=+1094.008622104" observedRunningTime="2025-12-04 09:57:30.857575171 +0000 UTC m=+1095.724055558" watchObservedRunningTime="2025-12-04 09:57:30.863810156 +0000 UTC m=+1095.730290533" Dec 04 09:57:30 crc kubenswrapper[4776]: I1204 09:57:30.931343 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj" podStartSLOduration=48.346199795 podStartE2EDuration="55.931327917s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:57:06.703991789 +0000 UTC m=+1071.570472166" lastFinishedPulling="2025-12-04 09:57:14.289119911 +0000 UTC m=+1079.155600288" observedRunningTime="2025-12-04 09:57:30.928878111 +0000 UTC m=+1095.795358488" 
watchObservedRunningTime="2025-12-04 09:57:30.931327917 +0000 UTC m=+1095.797808294" Dec 04 09:57:31 crc kubenswrapper[4776]: I1204 09:57:31.040782 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4c2d9" podStartSLOduration=5.482273075 podStartE2EDuration="56.040764799s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:38.595245695 +0000 UTC m=+1043.461726072" lastFinishedPulling="2025-12-04 09:57:29.153737419 +0000 UTC m=+1094.020217796" observedRunningTime="2025-12-04 09:57:31.010237954 +0000 UTC m=+1095.876718351" watchObservedRunningTime="2025-12-04 09:57:31.040764799 +0000 UTC m=+1095.907245176" Dec 04 09:57:31 crc kubenswrapper[4776]: I1204 09:57:31.043409 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84" podStartSLOduration=4.777845741 podStartE2EDuration="56.043397151s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:37.884573106 +0000 UTC m=+1042.751053473" lastFinishedPulling="2025-12-04 09:57:29.150124516 +0000 UTC m=+1094.016604883" observedRunningTime="2025-12-04 09:57:31.037183707 +0000 UTC m=+1095.903664084" watchObservedRunningTime="2025-12-04 09:57:31.043397151 +0000 UTC m=+1095.909877528" Dec 04 09:57:31 crc kubenswrapper[4776]: I1204 09:57:31.443061 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg" event={"ID":"23b5c3d3-b677-4440-b489-9e1811b722bb","Type":"ContainerStarted","Data":"86572e3d605b57ea28623b095d84b6bf3ee9fd89edbf69984f2b6ce5de87ded1"} Dec 04 09:57:31 crc kubenswrapper[4776]: I1204 09:57:31.443135 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg" Dec 04 09:57:31 crc 
kubenswrapper[4776]: I1204 09:57:31.447135 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb" Dec 04 09:57:31 crc kubenswrapper[4776]: I1204 09:57:31.463468 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg" podStartSLOduration=3.418856721 podStartE2EDuration="56.463450324s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:37.842819621 +0000 UTC m=+1042.709300008" lastFinishedPulling="2025-12-04 09:57:30.887413234 +0000 UTC m=+1095.753893611" observedRunningTime="2025-12-04 09:57:31.463130004 +0000 UTC m=+1096.329610381" watchObservedRunningTime="2025-12-04 09:57:31.463450324 +0000 UTC m=+1096.329930701" Dec 04 09:57:31 crc kubenswrapper[4776]: I1204 09:57:31.480087 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb" podStartSLOduration=3.482357795 podStartE2EDuration="56.480070474s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:37.884047849 +0000 UTC m=+1042.750528226" lastFinishedPulling="2025-12-04 09:57:30.881760518 +0000 UTC m=+1095.748240905" observedRunningTime="2025-12-04 09:57:31.477013818 +0000 UTC m=+1096.343494195" watchObservedRunningTime="2025-12-04 09:57:31.480070474 +0000 UTC m=+1096.346550851" Dec 04 09:57:31 crc kubenswrapper[4776]: E1204 09:57:31.485933 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7\\\"\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" podUID="a0857db7-00e4-410c-b5a2-945a46ae175a" Dec 04 09:57:31 
crc kubenswrapper[4776]: I1204 09:57:31.518523 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6" podStartSLOduration=4.219341258 podStartE2EDuration="56.518506875s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:38.636594618 +0000 UTC m=+1043.503074995" lastFinishedPulling="2025-12-04 09:57:30.935760235 +0000 UTC m=+1095.802240612" observedRunningTime="2025-12-04 09:57:31.517431212 +0000 UTC m=+1096.383911589" watchObservedRunningTime="2025-12-04 09:57:31.518506875 +0000 UTC m=+1096.384987252" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.484656 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc" event={"ID":"17848cf1-eceb-4e3e-9e39-40a7e4507d6b","Type":"ContainerStarted","Data":"8594815c2a0c2b1396447faf0a3b3b9355f82a8edbbc3cf3e50e80a5f10dfc89"} Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.485043 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.486599 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6" event={"ID":"f6f8f6ca-820b-41e8-af0a-aa6b439a3dad","Type":"ContainerStarted","Data":"0a5f956ef9e50b9b4eeb64c11179924362572f9810e6cb755887df2f4f9b412f"} Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.486895 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.488311 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7" 
event={"ID":"50a0ede3-8c98-47c6-945e-6aeefa27f86e","Type":"ContainerStarted","Data":"88ef4200c034ac99111e4ec4b507fb9e1e5ab06cb95af90f87d518a924f5698a"} Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.488454 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.489523 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb" event={"ID":"eca2af80-0e84-4615-9bd7-a907029259e7","Type":"ContainerStarted","Data":"bd0cb4cba0d7953fe2e92fe7f300a072a2209b20e3657592527c26e07428f673"} Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.490978 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6" event={"ID":"115873e4-456f-4d60-84f0-182f467cb8c0","Type":"ContainerStarted","Data":"36504189df587adb9fca7e57109e5ef0d10bed54dae138decedd549cadee34ed"} Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.491094 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.492762 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc" event={"ID":"c0269b5f-db90-427e-933b-6221bcfbde9e","Type":"ContainerStarted","Data":"4402f060d390c918296c1e30fd115745c3a72be3ad0873bfd0ceb9d0ff85c548"} Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.492879 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.495368 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6" event={"ID":"34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe","Type":"ContainerStarted","Data":"2cab05d9e849f9112ca792e0e4c48d280b845ba159a7a6fcab2d1bb7f7fd7454"} Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.495577 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.497140 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4" event={"ID":"fe5ac80c-367a-489b-901e-76d872a26e4b","Type":"ContainerStarted","Data":"efdad1adc50925e605bd78f64c52713e9b59cbb6280384e1ce96582319ace0fb"} Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.498057 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.501218 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq" event={"ID":"2ceaf037-5fce-4ef5-b273-724eb446e0af","Type":"ContainerStarted","Data":"197fc248d83bddcf0fa52dba18b11c97dc321c4619d7157bfd9583f284f9e98e"} Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.501396 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.550451 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc" podStartSLOduration=5.176513913 podStartE2EDuration="57.550428179s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:38.56407021 +0000 UTC m=+1043.430550587" 
lastFinishedPulling="2025-12-04 09:57:30.937984466 +0000 UTC m=+1095.804464853" observedRunningTime="2025-12-04 09:57:32.545775313 +0000 UTC m=+1097.412255690" watchObservedRunningTime="2025-12-04 09:57:32.550428179 +0000 UTC m=+1097.416908556" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.574350 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7" podStartSLOduration=4.082676646 podStartE2EDuration="57.574330726s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:37.829632319 +0000 UTC m=+1042.696112696" lastFinishedPulling="2025-12-04 09:57:31.321286399 +0000 UTC m=+1096.187766776" observedRunningTime="2025-12-04 09:57:32.572183009 +0000 UTC m=+1097.438663386" watchObservedRunningTime="2025-12-04 09:57:32.574330726 +0000 UTC m=+1097.440811103" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.618883 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4" podStartSLOduration=4.370177483 podStartE2EDuration="57.618864068s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:37.88407424 +0000 UTC m=+1042.750554617" lastFinishedPulling="2025-12-04 09:57:31.132760825 +0000 UTC m=+1095.999241202" observedRunningTime="2025-12-04 09:57:32.613115669 +0000 UTC m=+1097.479596066" watchObservedRunningTime="2025-12-04 09:57:32.618864068 +0000 UTC m=+1097.485344445" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.632244 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6" podStartSLOduration=3.757141388 podStartE2EDuration="57.632225726s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:37.182396133 +0000 UTC m=+1042.048876510" 
lastFinishedPulling="2025-12-04 09:57:31.057480471 +0000 UTC m=+1095.923960848" observedRunningTime="2025-12-04 09:57:32.629726668 +0000 UTC m=+1097.496207055" watchObservedRunningTime="2025-12-04 09:57:32.632225726 +0000 UTC m=+1097.498706103" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.660319 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq" podStartSLOduration=3.50037093 podStartE2EDuration="57.660293884s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:36.875134216 +0000 UTC m=+1041.741614593" lastFinishedPulling="2025-12-04 09:57:31.03505717 +0000 UTC m=+1095.901537547" observedRunningTime="2025-12-04 09:57:32.654400459 +0000 UTC m=+1097.520880846" watchObservedRunningTime="2025-12-04 09:57:32.660293884 +0000 UTC m=+1097.526774261" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.699600 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6" podStartSLOduration=5.300629095 podStartE2EDuration="57.699580412s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:38.538585194 +0000 UTC m=+1043.405065571" lastFinishedPulling="2025-12-04 09:57:30.937536511 +0000 UTC m=+1095.804016888" observedRunningTime="2025-12-04 09:57:32.696146775 +0000 UTC m=+1097.562627162" watchObservedRunningTime="2025-12-04 09:57:32.699580412 +0000 UTC m=+1097.566060789" Dec 04 09:57:32 crc kubenswrapper[4776]: I1204 09:57:32.745715 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc" podStartSLOduration=5.445572267 podStartE2EDuration="57.745682604s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:56:38.72429739 +0000 UTC m=+1043.590777777" 
lastFinishedPulling="2025-12-04 09:57:31.024407737 +0000 UTC m=+1095.890888114" observedRunningTime="2025-12-04 09:57:32.745637693 +0000 UTC m=+1097.612118090" watchObservedRunningTime="2025-12-04 09:57:32.745682604 +0000 UTC m=+1097.612162981" Dec 04 09:57:35 crc kubenswrapper[4776]: I1204 09:57:35.662438 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-z8q57" Dec 04 09:57:36 crc kubenswrapper[4776]: I1204 09:57:36.052952 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-ldf84" Dec 04 09:57:36 crc kubenswrapper[4776]: I1204 09:57:36.081815 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-x9jlc" Dec 04 09:57:36 crc kubenswrapper[4776]: I1204 09:57:36.089800 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-z6kf6" Dec 04 09:57:36 crc kubenswrapper[4776]: I1204 09:57:36.395078 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-79d898f8f7-lbtlb" Dec 04 09:57:36 crc kubenswrapper[4776]: I1204 09:57:36.406362 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-5l4h4" Dec 04 09:57:36 crc kubenswrapper[4776]: I1204 09:57:36.448805 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hscd7" Dec 04 09:57:36 crc kubenswrapper[4776]: I1204 09:57:36.517057 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4d8fg" Dec 04 09:57:36 crc 
kubenswrapper[4776]: I1204 09:57:36.568053 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-ft7rc" Dec 04 09:57:36 crc kubenswrapper[4776]: I1204 09:57:36.568105 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8wwhc" Dec 04 09:57:36 crc kubenswrapper[4776]: I1204 09:57:36.596788 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-mkn8n" Dec 04 09:57:36 crc kubenswrapper[4776]: I1204 09:57:36.599901 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mlnr6" Dec 04 09:57:36 crc kubenswrapper[4776]: I1204 09:57:36.613224 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wbcs6" Dec 04 09:57:45 crc kubenswrapper[4776]: I1204 09:57:45.687711 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-d7zhq" Dec 04 09:57:46 crc kubenswrapper[4776]: I1204 09:57:46.621725 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" event={"ID":"a0857db7-00e4-410c-b5a2-945a46ae175a","Type":"ContainerStarted","Data":"70a8f158e6d70df0594a8bf6b0f511f37b037ddb2dfdfbaad13ab255abc517b6"} Dec 04 09:57:46 crc kubenswrapper[4776]: I1204 09:57:46.621973 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:57:46 crc kubenswrapper[4776]: I1204 09:57:46.638827 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" podStartSLOduration=33.543469782 podStartE2EDuration="1m11.638807247s" podCreationTimestamp="2025-12-04 09:56:35 +0000 UTC" firstStartedPulling="2025-12-04 09:57:07.430317947 +0000 UTC m=+1072.296798324" lastFinishedPulling="2025-12-04 09:57:45.525655412 +0000 UTC m=+1110.392135789" observedRunningTime="2025-12-04 09:57:46.635787613 +0000 UTC m=+1111.502268020" watchObservedRunningTime="2025-12-04 09:57:46.638807247 +0000 UTC m=+1111.505287624" Dec 04 09:57:49 crc kubenswrapper[4776]: I1204 09:57:49.379313 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:57:49 crc kubenswrapper[4776]: I1204 09:57:49.379659 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:57:51 crc kubenswrapper[4776]: I1204 09:57:51.850206 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fk6f5" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.368236 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8svng"] Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.370834 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8svng" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.375202 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.375555 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.375857 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.376325 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-9mdhc" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.387386 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8svng"] Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.487017 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pzxpv"] Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.488410 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.493213 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.527968 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pzxpv"] Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.535998 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec13a2bf-8612-4605-a179-dcce0a2dfe06-config\") pod \"dnsmasq-dns-675f4bcbfc-8svng\" (UID: \"ec13a2bf-8612-4605-a179-dcce0a2dfe06\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8svng" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.536108 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ghtz\" (UniqueName: \"kubernetes.io/projected/ec13a2bf-8612-4605-a179-dcce0a2dfe06-kube-api-access-4ghtz\") pod \"dnsmasq-dns-675f4bcbfc-8svng\" (UID: \"ec13a2bf-8612-4605-a179-dcce0a2dfe06\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8svng" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.637745 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b66163a-a528-44d4-9e9d-4eebd6969c60-config\") pod \"dnsmasq-dns-78dd6ddcc-pzxpv\" (UID: \"0b66163a-a528-44d4-9e9d-4eebd6969c60\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.637815 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec13a2bf-8612-4605-a179-dcce0a2dfe06-config\") pod \"dnsmasq-dns-675f4bcbfc-8svng\" (UID: \"ec13a2bf-8612-4605-a179-dcce0a2dfe06\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8svng" Dec 04 09:58:06 crc 
kubenswrapper[4776]: I1204 09:58:06.637848 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnvwd\" (UniqueName: \"kubernetes.io/projected/0b66163a-a528-44d4-9e9d-4eebd6969c60-kube-api-access-fnvwd\") pod \"dnsmasq-dns-78dd6ddcc-pzxpv\" (UID: \"0b66163a-a528-44d4-9e9d-4eebd6969c60\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.637898 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ghtz\" (UniqueName: \"kubernetes.io/projected/ec13a2bf-8612-4605-a179-dcce0a2dfe06-kube-api-access-4ghtz\") pod \"dnsmasq-dns-675f4bcbfc-8svng\" (UID: \"ec13a2bf-8612-4605-a179-dcce0a2dfe06\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8svng" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.637930 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b66163a-a528-44d4-9e9d-4eebd6969c60-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pzxpv\" (UID: \"0b66163a-a528-44d4-9e9d-4eebd6969c60\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.639140 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec13a2bf-8612-4605-a179-dcce0a2dfe06-config\") pod \"dnsmasq-dns-675f4bcbfc-8svng\" (UID: \"ec13a2bf-8612-4605-a179-dcce0a2dfe06\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8svng" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.672901 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ghtz\" (UniqueName: \"kubernetes.io/projected/ec13a2bf-8612-4605-a179-dcce0a2dfe06-kube-api-access-4ghtz\") pod \"dnsmasq-dns-675f4bcbfc-8svng\" (UID: \"ec13a2bf-8612-4605-a179-dcce0a2dfe06\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8svng" Dec 04 09:58:06 crc kubenswrapper[4776]: 
I1204 09:58:06.698075 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8svng" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.740041 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b66163a-a528-44d4-9e9d-4eebd6969c60-config\") pod \"dnsmasq-dns-78dd6ddcc-pzxpv\" (UID: \"0b66163a-a528-44d4-9e9d-4eebd6969c60\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.740127 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnvwd\" (UniqueName: \"kubernetes.io/projected/0b66163a-a528-44d4-9e9d-4eebd6969c60-kube-api-access-fnvwd\") pod \"dnsmasq-dns-78dd6ddcc-pzxpv\" (UID: \"0b66163a-a528-44d4-9e9d-4eebd6969c60\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.740183 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b66163a-a528-44d4-9e9d-4eebd6969c60-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pzxpv\" (UID: \"0b66163a-a528-44d4-9e9d-4eebd6969c60\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.741075 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b66163a-a528-44d4-9e9d-4eebd6969c60-config\") pod \"dnsmasq-dns-78dd6ddcc-pzxpv\" (UID: \"0b66163a-a528-44d4-9e9d-4eebd6969c60\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.741411 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b66163a-a528-44d4-9e9d-4eebd6969c60-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pzxpv\" (UID: \"0b66163a-a528-44d4-9e9d-4eebd6969c60\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.765065 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnvwd\" (UniqueName: \"kubernetes.io/projected/0b66163a-a528-44d4-9e9d-4eebd6969c60-kube-api-access-fnvwd\") pod \"dnsmasq-dns-78dd6ddcc-pzxpv\" (UID: \"0b66163a-a528-44d4-9e9d-4eebd6969c60\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" Dec 04 09:58:06 crc kubenswrapper[4776]: I1204 09:58:06.815311 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" Dec 04 09:58:07 crc kubenswrapper[4776]: I1204 09:58:07.230307 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8svng"] Dec 04 09:58:07 crc kubenswrapper[4776]: I1204 09:58:07.358755 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pzxpv"] Dec 04 09:58:07 crc kubenswrapper[4776]: W1204 09:58:07.363100 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b66163a_a528_44d4_9e9d_4eebd6969c60.slice/crio-1830141efa4b9064796ce79225ef66339c1b22f7115e82cafa999e8c7b7e6604 WatchSource:0}: Error finding container 1830141efa4b9064796ce79225ef66339c1b22f7115e82cafa999e8c7b7e6604: Status 404 returned error can't find the container with id 1830141efa4b9064796ce79225ef66339c1b22f7115e82cafa999e8c7b7e6604 Dec 04 09:58:07 crc kubenswrapper[4776]: I1204 09:58:07.782519 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" event={"ID":"0b66163a-a528-44d4-9e9d-4eebd6969c60","Type":"ContainerStarted","Data":"1830141efa4b9064796ce79225ef66339c1b22f7115e82cafa999e8c7b7e6604"} Dec 04 09:58:07 crc kubenswrapper[4776]: I1204 09:58:07.783634 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-8svng" 
event={"ID":"ec13a2bf-8612-4605-a179-dcce0a2dfe06","Type":"ContainerStarted","Data":"720f927b351847761560af09d5fb29a65d7d3017d04e86dd1201c6778a2d94ec"} Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.498179 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8svng"] Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.530206 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-sq9zf"] Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.532006 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.556854 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-sq9zf"] Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.667587 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfhrh\" (UniqueName: \"kubernetes.io/projected/0b689a22-b93d-4fd7-80ac-6593ade4066a-kube-api-access-kfhrh\") pod \"dnsmasq-dns-666b6646f7-sq9zf\" (UID: \"0b689a22-b93d-4fd7-80ac-6593ade4066a\") " pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.667696 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b689a22-b93d-4fd7-80ac-6593ade4066a-dns-svc\") pod \"dnsmasq-dns-666b6646f7-sq9zf\" (UID: \"0b689a22-b93d-4fd7-80ac-6593ade4066a\") " pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.667731 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b689a22-b93d-4fd7-80ac-6593ade4066a-config\") pod \"dnsmasq-dns-666b6646f7-sq9zf\" (UID: \"0b689a22-b93d-4fd7-80ac-6593ade4066a\") " 
pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.773505 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b689a22-b93d-4fd7-80ac-6593ade4066a-dns-svc\") pod \"dnsmasq-dns-666b6646f7-sq9zf\" (UID: \"0b689a22-b93d-4fd7-80ac-6593ade4066a\") " pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.773563 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b689a22-b93d-4fd7-80ac-6593ade4066a-config\") pod \"dnsmasq-dns-666b6646f7-sq9zf\" (UID: \"0b689a22-b93d-4fd7-80ac-6593ade4066a\") " pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.773742 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfhrh\" (UniqueName: \"kubernetes.io/projected/0b689a22-b93d-4fd7-80ac-6593ade4066a-kube-api-access-kfhrh\") pod \"dnsmasq-dns-666b6646f7-sq9zf\" (UID: \"0b689a22-b93d-4fd7-80ac-6593ade4066a\") " pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.779030 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b689a22-b93d-4fd7-80ac-6593ade4066a-config\") pod \"dnsmasq-dns-666b6646f7-sq9zf\" (UID: \"0b689a22-b93d-4fd7-80ac-6593ade4066a\") " pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.780973 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b689a22-b93d-4fd7-80ac-6593ade4066a-dns-svc\") pod \"dnsmasq-dns-666b6646f7-sq9zf\" (UID: \"0b689a22-b93d-4fd7-80ac-6593ade4066a\") " pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.813349 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfhrh\" (UniqueName: \"kubernetes.io/projected/0b689a22-b93d-4fd7-80ac-6593ade4066a-kube-api-access-kfhrh\") pod \"dnsmasq-dns-666b6646f7-sq9zf\" (UID: \"0b689a22-b93d-4fd7-80ac-6593ade4066a\") " pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.860100 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.862470 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pzxpv"] Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.905121 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lhh6z"] Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.907795 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.915488 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lhh6z"] Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.980734 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a246b792-5a73-4f08-9373-8a719864bd7d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-lhh6z\" (UID: \"a246b792-5a73-4f08-9373-8a719864bd7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:58:09 crc kubenswrapper[4776]: I1204 09:58:09.980843 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a246b792-5a73-4f08-9373-8a719864bd7d-config\") pod \"dnsmasq-dns-57d769cc4f-lhh6z\" (UID: \"a246b792-5a73-4f08-9373-8a719864bd7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:58:09 crc 
kubenswrapper[4776]: I1204 09:58:09.980871 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq2lg\" (UniqueName: \"kubernetes.io/projected/a246b792-5a73-4f08-9373-8a719864bd7d-kube-api-access-cq2lg\") pod \"dnsmasq-dns-57d769cc4f-lhh6z\" (UID: \"a246b792-5a73-4f08-9373-8a719864bd7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.083217 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a246b792-5a73-4f08-9373-8a719864bd7d-config\") pod \"dnsmasq-dns-57d769cc4f-lhh6z\" (UID: \"a246b792-5a73-4f08-9373-8a719864bd7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.083874 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq2lg\" (UniqueName: \"kubernetes.io/projected/a246b792-5a73-4f08-9373-8a719864bd7d-kube-api-access-cq2lg\") pod \"dnsmasq-dns-57d769cc4f-lhh6z\" (UID: \"a246b792-5a73-4f08-9373-8a719864bd7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.083985 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a246b792-5a73-4f08-9373-8a719864bd7d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-lhh6z\" (UID: \"a246b792-5a73-4f08-9373-8a719864bd7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.084199 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a246b792-5a73-4f08-9373-8a719864bd7d-config\") pod \"dnsmasq-dns-57d769cc4f-lhh6z\" (UID: \"a246b792-5a73-4f08-9373-8a719864bd7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.084855 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a246b792-5a73-4f08-9373-8a719864bd7d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-lhh6z\" (UID: \"a246b792-5a73-4f08-9373-8a719864bd7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.116676 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq2lg\" (UniqueName: \"kubernetes.io/projected/a246b792-5a73-4f08-9373-8a719864bd7d-kube-api-access-cq2lg\") pod \"dnsmasq-dns-57d769cc4f-lhh6z\" (UID: \"a246b792-5a73-4f08-9373-8a719864bd7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.310184 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.492093 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-sq9zf"] Dec 04 09:58:10 crc kubenswrapper[4776]: W1204 09:58:10.504009 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b689a22_b93d_4fd7_80ac_6593ade4066a.slice/crio-8f0f26684ea4ef1bc9304ddb05da6326d7c74379708ae6aeb6c1953836930481 WatchSource:0}: Error finding container 8f0f26684ea4ef1bc9304ddb05da6326d7c74379708ae6aeb6c1953836930481: Status 404 returned error can't find the container with id 8f0f26684ea4ef1bc9304ddb05da6326d7c74379708ae6aeb6c1953836930481 Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.688461 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.690340 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.697860 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.698115 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.698247 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.698384 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.698382 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.698657 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.699415 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jxvm9" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.709257 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.798365 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.798453 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-config-data\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.798515 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.798553 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.798605 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.798693 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.798786 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.798885 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.798949 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65d6s\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-kube-api-access-65d6s\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.798967 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.799116 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.830079 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" event={"ID":"0b689a22-b93d-4fd7-80ac-6593ade4066a","Type":"ContainerStarted","Data":"8f0f26684ea4ef1bc9304ddb05da6326d7c74379708ae6aeb6c1953836930481"} Dec 04 09:58:10 crc 
kubenswrapper[4776]: I1204 09:58:10.901002 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.901099 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.902829 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.902895 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.902964 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.903051 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.903122 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65d6s\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-kube-api-access-65d6s\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.903168 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.904204 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.904235 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.904254 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-config-data\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") 
" pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.904326 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.904373 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.905795 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-config-data\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.907429 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.910049 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lhh6z"] Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.914651 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc 
kubenswrapper[4776]: I1204 09:58:10.915650 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.917378 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.917468 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.917893 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.923514 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.924222 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65d6s\" (UniqueName: 
\"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-kube-api-access-65d6s\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:10 crc kubenswrapper[4776]: I1204 09:58:10.948660 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " pod="openstack/rabbitmq-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.011086 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.012538 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.015114 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.015674 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.016027 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.017460 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.017652 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.021349 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.021824 4776 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.029739 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ng7tx" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.048401 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.106439 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.106489 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.106625 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.106659 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2nln\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-kube-api-access-c2nln\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.106678 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.106716 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.106756 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.106774 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.106836 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 
04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.106899 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.106958 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.208540 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.208620 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.208649 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.208681 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.208710 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.208740 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.208829 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.208856 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.208894 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.209011 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2nln\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-kube-api-access-c2nln\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.209032 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.209192 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.209713 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.214146 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-erlang-cookie-secret\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.214953 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.216630 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.217031 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.219088 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.220533 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 
09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.222520 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.229825 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.246968 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.253545 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2nln\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-kube-api-access-c2nln\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.401651 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.859386 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" event={"ID":"a246b792-5a73-4f08-9373-8a719864bd7d","Type":"ContainerStarted","Data":"8f1a03e4760b4786145c8cdeaab339cef724db480e092e4ac44f6a53cb0a59a6"} Dec 04 09:58:11 crc kubenswrapper[4776]: I1204 09:58:11.882419 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 09:58:11 crc kubenswrapper[4776]: W1204 09:58:11.937025 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ba805ac_f1c5_4049_bb56_7dfff0ccc76c.slice/crio-d2a6fd28d05f34761d54c8f7c42b167f3b2a5714d40961f574d3c690bc14b951 WatchSource:0}: Error finding container d2a6fd28d05f34761d54c8f7c42b167f3b2a5714d40961f574d3c690bc14b951: Status 404 returned error can't find the container with id d2a6fd28d05f34761d54c8f7c42b167f3b2a5714d40961f574d3c690bc14b951 Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.066161 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.352069 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.353470 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.359249 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.359787 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.360035 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.360322 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.360889 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-kj4dq" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.363951 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.538551 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b38109cb-9fe9-429d-b580-999d6978f536-config-data-default\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.538629 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b38109cb-9fe9-429d-b580-999d6978f536-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.538656 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b38109cb-9fe9-429d-b580-999d6978f536-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.538696 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b38109cb-9fe9-429d-b580-999d6978f536-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.538731 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b38109cb-9fe9-429d-b580-999d6978f536-kolla-config\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.538749 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b38109cb-9fe9-429d-b580-999d6978f536-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.538965 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.539028 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hpd6j\" (UniqueName: \"kubernetes.io/projected/b38109cb-9fe9-429d-b580-999d6978f536-kube-api-access-hpd6j\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.645173 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b38109cb-9fe9-429d-b580-999d6978f536-config-data-default\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.645273 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b38109cb-9fe9-429d-b580-999d6978f536-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.645358 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b38109cb-9fe9-429d-b580-999d6978f536-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.645393 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b38109cb-9fe9-429d-b580-999d6978f536-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.645436 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b38109cb-9fe9-429d-b580-999d6978f536-kolla-config\") 
pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.645455 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b38109cb-9fe9-429d-b580-999d6978f536-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.645498 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.645521 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpd6j\" (UniqueName: \"kubernetes.io/projected/b38109cb-9fe9-429d-b580-999d6978f536-kube-api-access-hpd6j\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.646401 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b38109cb-9fe9-429d-b580-999d6978f536-config-data-default\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.646476 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b38109cb-9fe9-429d-b580-999d6978f536-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: 
I1204 09:58:12.646973 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b38109cb-9fe9-429d-b580-999d6978f536-kolla-config\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.647352 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.648269 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b38109cb-9fe9-429d-b580-999d6978f536-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.656204 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b38109cb-9fe9-429d-b580-999d6978f536-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.658658 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b38109cb-9fe9-429d-b580-999d6978f536-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.668675 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpd6j\" (UniqueName: 
\"kubernetes.io/projected/b38109cb-9fe9-429d-b580-999d6978f536-kube-api-access-hpd6j\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.692302 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b38109cb-9fe9-429d-b580-999d6978f536\") " pod="openstack/openstack-galera-0" Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.876534 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c","Type":"ContainerStarted","Data":"d2a6fd28d05f34761d54c8f7c42b167f3b2a5714d40961f574d3c690bc14b951"} Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.878955 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e","Type":"ContainerStarted","Data":"fe98d8f101b49fc76afc5392f96b7b9cfc8995764ca7b4d72002dad3d6a9117f"} Dec 04 09:58:12 crc kubenswrapper[4776]: I1204 09:58:12.993637 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.515574 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.520450 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.524841 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.525487 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.525544 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.525735 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-pxsx5" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.526209 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.631881 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.632960 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.636540 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.636678 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.638781 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-2t9hw" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.652084 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.668278 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/be0c172a-45d2-4fab-940c-f343c9e227fc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.668366 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.668522 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0c172a-45d2-4fab-940c-f343c9e227fc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.668555 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0c172a-45d2-4fab-940c-f343c9e227fc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.668600 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-569zq\" (UniqueName: \"kubernetes.io/projected/be0c172a-45d2-4fab-940c-f343c9e227fc-kube-api-access-569zq\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.668626 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be0c172a-45d2-4fab-940c-f343c9e227fc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.668652 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/be0c172a-45d2-4fab-940c-f343c9e227fc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.668675 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be0c172a-45d2-4fab-940c-f343c9e227fc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 
09:58:13.771888 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1597214-7e53-46e4-8ba2-3732fc1ebf29-kolla-config\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.772013 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1597214-7e53-46e4-8ba2-3732fc1ebf29-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.772056 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0c172a-45d2-4fab-940c-f343c9e227fc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.772078 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0c172a-45d2-4fab-940c-f343c9e227fc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.772101 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-569zq\" (UniqueName: \"kubernetes.io/projected/be0c172a-45d2-4fab-940c-f343c9e227fc-kube-api-access-569zq\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.772124 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zhmg\" (UniqueName: \"kubernetes.io/projected/c1597214-7e53-46e4-8ba2-3732fc1ebf29-kube-api-access-9zhmg\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.773107 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/be0c172a-45d2-4fab-940c-f343c9e227fc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.773171 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be0c172a-45d2-4fab-940c-f343c9e227fc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.773212 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be0c172a-45d2-4fab-940c-f343c9e227fc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.773257 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1597214-7e53-46e4-8ba2-3732fc1ebf29-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.773283 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/be0c172a-45d2-4fab-940c-f343c9e227fc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.773391 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.773461 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1597214-7e53-46e4-8ba2-3732fc1ebf29-config-data\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.773545 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/be0c172a-45d2-4fab-940c-f343c9e227fc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.774191 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be0c172a-45d2-4fab-940c-f343c9e227fc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.777318 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"be0c172a-45d2-4fab-940c-f343c9e227fc\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.778863 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/be0c172a-45d2-4fab-940c-f343c9e227fc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.783322 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be0c172a-45d2-4fab-940c-f343c9e227fc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.792802 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0c172a-45d2-4fab-940c-f343c9e227fc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.796744 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0c172a-45d2-4fab-940c-f343c9e227fc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.800111 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-569zq\" (UniqueName: \"kubernetes.io/projected/be0c172a-45d2-4fab-940c-f343c9e227fc-kube-api-access-569zq\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " 
pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.821065 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"be0c172a-45d2-4fab-940c-f343c9e227fc\") " pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.861887 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.875801 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1597214-7e53-46e4-8ba2-3732fc1ebf29-config-data\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.875943 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1597214-7e53-46e4-8ba2-3732fc1ebf29-kolla-config\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.875968 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1597214-7e53-46e4-8ba2-3732fc1ebf29-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.875997 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zhmg\" (UniqueName: \"kubernetes.io/projected/c1597214-7e53-46e4-8ba2-3732fc1ebf29-kube-api-access-9zhmg\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " 
pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.876027 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1597214-7e53-46e4-8ba2-3732fc1ebf29-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.877801 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1597214-7e53-46e4-8ba2-3732fc1ebf29-kolla-config\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.880039 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1597214-7e53-46e4-8ba2-3732fc1ebf29-config-data\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.886849 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1597214-7e53-46e4-8ba2-3732fc1ebf29-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.887068 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1597214-7e53-46e4-8ba2-3732fc1ebf29-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.902062 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zhmg\" (UniqueName: 
\"kubernetes.io/projected/c1597214-7e53-46e4-8ba2-3732fc1ebf29-kube-api-access-9zhmg\") pod \"memcached-0\" (UID: \"c1597214-7e53-46e4-8ba2-3732fc1ebf29\") " pod="openstack/memcached-0" Dec 04 09:58:13 crc kubenswrapper[4776]: I1204 09:58:13.956298 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 04 09:58:15 crc kubenswrapper[4776]: I1204 09:58:15.628881 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 09:58:15 crc kubenswrapper[4776]: I1204 09:58:15.630136 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 09:58:15 crc kubenswrapper[4776]: I1204 09:58:15.633267 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2gltg" Dec 04 09:58:15 crc kubenswrapper[4776]: I1204 09:58:15.660401 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 09:58:15 crc kubenswrapper[4776]: I1204 09:58:15.723399 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvcsc\" (UniqueName: \"kubernetes.io/projected/f57cb5d1-1baa-4fc7-8c71-16d1138dab82-kube-api-access-lvcsc\") pod \"kube-state-metrics-0\" (UID: \"f57cb5d1-1baa-4fc7-8c71-16d1138dab82\") " pod="openstack/kube-state-metrics-0" Dec 04 09:58:15 crc kubenswrapper[4776]: I1204 09:58:15.824659 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvcsc\" (UniqueName: \"kubernetes.io/projected/f57cb5d1-1baa-4fc7-8c71-16d1138dab82-kube-api-access-lvcsc\") pod \"kube-state-metrics-0\" (UID: \"f57cb5d1-1baa-4fc7-8c71-16d1138dab82\") " pod="openstack/kube-state-metrics-0" Dec 04 09:58:15 crc kubenswrapper[4776]: I1204 09:58:15.869132 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvcsc\" (UniqueName: 
\"kubernetes.io/projected/f57cb5d1-1baa-4fc7-8c71-16d1138dab82-kube-api-access-lvcsc\") pod \"kube-state-metrics-0\" (UID: \"f57cb5d1-1baa-4fc7-8c71-16d1138dab82\") " pod="openstack/kube-state-metrics-0" Dec 04 09:58:15 crc kubenswrapper[4776]: I1204 09:58:15.956116 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.311840 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-tchdq"] Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.314652 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.317802 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.318711 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-pghrh" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.318772 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.330087 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tchdq"] Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.370268 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-9mnct"] Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.372588 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.379657 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1100839e-9cfb-4361-a653-321d0d431072-var-log-ovn\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.379735 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1100839e-9cfb-4361-a653-321d0d431072-var-run-ovn\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.379766 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1100839e-9cfb-4361-a653-321d0d431072-var-run\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.379809 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1100839e-9cfb-4361-a653-321d0d431072-combined-ca-bundle\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.379844 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrvrl\" (UniqueName: \"kubernetes.io/projected/1100839e-9cfb-4361-a653-321d0d431072-kube-api-access-zrvrl\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " 
pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.379905 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1100839e-9cfb-4361-a653-321d0d431072-scripts\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.380202 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1100839e-9cfb-4361-a653-321d0d431072-ovn-controller-tls-certs\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.380458 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.380523 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.380583 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.381204 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"e01a20d48aa8f7249b057929edbda0928b81534859b7bbd6d1f1ff0ee5da05c8"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.381287 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://e01a20d48aa8f7249b057929edbda0928b81534859b7bbd6d1f1ff0ee5da05c8" gracePeriod=600 Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.391583 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9mnct"] Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.481206 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1100839e-9cfb-4361-a653-321d0d431072-var-run-ovn\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.481258 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1100839e-9cfb-4361-a653-321d0d431072-var-run\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.481287 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1100839e-9cfb-4361-a653-321d0d431072-combined-ca-bundle\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.481318 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrvrl\" (UniqueName: \"kubernetes.io/projected/1100839e-9cfb-4361-a653-321d0d431072-kube-api-access-zrvrl\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.481343 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1100839e-9cfb-4361-a653-321d0d431072-scripts\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.481388 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-var-lib\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.481418 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fch6z\" (UniqueName: \"kubernetes.io/projected/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-kube-api-access-fch6z\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.481442 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-etc-ovs\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.481466 4776 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1100839e-9cfb-4361-a653-321d0d431072-ovn-controller-tls-certs\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.481480 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-scripts\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.481503 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1100839e-9cfb-4361-a653-321d0d431072-var-log-ovn\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.481520 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-var-run\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.481546 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-var-log\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.483107 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/1100839e-9cfb-4361-a653-321d0d431072-var-run\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.483189 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1100839e-9cfb-4361-a653-321d0d431072-var-log-ovn\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.483199 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1100839e-9cfb-4361-a653-321d0d431072-var-run-ovn\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.487779 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1100839e-9cfb-4361-a653-321d0d431072-combined-ca-bundle\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.488767 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1100839e-9cfb-4361-a653-321d0d431072-ovn-controller-tls-certs\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.503862 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrvrl\" (UniqueName: \"kubernetes.io/projected/1100839e-9cfb-4361-a653-321d0d431072-kube-api-access-zrvrl\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") 
" pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.583904 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-var-lib\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.584007 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fch6z\" (UniqueName: \"kubernetes.io/projected/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-kube-api-access-fch6z\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.584046 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-etc-ovs\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.584094 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-scripts\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.584467 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-etc-ovs\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.584519 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-var-run\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.584516 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-var-lib\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.584555 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-var-log\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.584540 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-var-run\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.584735 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-var-log\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.586671 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-scripts\") pod \"ovn-controller-ovs-9mnct\" (UID: 
\"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.600778 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fch6z\" (UniqueName: \"kubernetes.io/projected/bd5ed17f-c4f4-4b17-b14c-d8717fc116f6-kube-api-access-fch6z\") pod \"ovn-controller-ovs-9mnct\" (UID: \"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6\") " pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.693634 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.858965 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1100839e-9cfb-4361-a653-321d0d431072-scripts\") pod \"ovn-controller-tchdq\" (UID: \"1100839e-9cfb-4361-a653-321d0d431072\") " pod="openstack/ovn-controller-tchdq" Dec 04 09:58:19 crc kubenswrapper[4776]: I1204 09:58:19.979073 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-tchdq" Dec 04 09:58:21 crc kubenswrapper[4776]: I1204 09:58:21.059683 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="e01a20d48aa8f7249b057929edbda0928b81534859b7bbd6d1f1ff0ee5da05c8" exitCode=0 Dec 04 09:58:21 crc kubenswrapper[4776]: I1204 09:58:21.059777 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"e01a20d48aa8f7249b057929edbda0928b81534859b7bbd6d1f1ff0ee5da05c8"} Dec 04 09:58:21 crc kubenswrapper[4776]: I1204 09:58:21.060077 4776 scope.go:117] "RemoveContainer" containerID="7dee382b67ae6de15878aafacfd524a1e7ecdaa0880997ede900fe467e79e6d0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.505791 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.507477 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.511254 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.511469 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.511488 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.511725 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.517423 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.529707 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-m8m7s" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.661037 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.662677 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.670582 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.671715 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-49rxs" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.671967 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.673286 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.693898 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.714368 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c35b05be-3fec-4a42-af88-c80ad4c6833e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.714456 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c35b05be-3fec-4a42-af88-c80ad4c6833e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.714502 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c35b05be-3fec-4a42-af88-c80ad4c6833e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.714533 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.714549 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c35b05be-3fec-4a42-af88-c80ad4c6833e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.714574 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35b05be-3fec-4a42-af88-c80ad4c6833e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.714619 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c35b05be-3fec-4a42-af88-c80ad4c6833e-config\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.714662 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv96p\" (UniqueName: \"kubernetes.io/projected/c35b05be-3fec-4a42-af88-c80ad4c6833e-kube-api-access-rv96p\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 
09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.817652 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1e14cd-4110-4ed1-9884-1318d980a844-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.817724 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c35b05be-3fec-4a42-af88-c80ad4c6833e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.817753 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c35b05be-3fec-4a42-af88-c80ad4c6833e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.817774 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.817812 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sqg9\" (UniqueName: \"kubernetes.io/projected/2d1e14cd-4110-4ed1-9884-1318d980a844-kube-api-access-9sqg9\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.817835 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d1e14cd-4110-4ed1-9884-1318d980a844-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.817873 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1e14cd-4110-4ed1-9884-1318d980a844-config\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.817899 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c35b05be-3fec-4a42-af88-c80ad4c6833e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.817947 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1e14cd-4110-4ed1-9884-1318d980a844-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.817986 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c35b05be-3fec-4a42-af88-c80ad4c6833e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.818015 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c35b05be-3fec-4a42-af88-c80ad4c6833e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.818035 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1e14cd-4110-4ed1-9884-1318d980a844-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.818061 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c35b05be-3fec-4a42-af88-c80ad4c6833e-config\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.818081 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d1e14cd-4110-4ed1-9884-1318d980a844-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.818105 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.818122 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv96p\" (UniqueName: \"kubernetes.io/projected/c35b05be-3fec-4a42-af88-c80ad4c6833e-kube-api-access-rv96p\") pod \"ovsdbserver-nb-0\" (UID: 
\"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.818392 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c35b05be-3fec-4a42-af88-c80ad4c6833e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.819649 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.820027 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c35b05be-3fec-4a42-af88-c80ad4c6833e-config\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.820603 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c35b05be-3fec-4a42-af88-c80ad4c6833e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.828178 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35b05be-3fec-4a42-af88-c80ad4c6833e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.828064 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c35b05be-3fec-4a42-af88-c80ad4c6833e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.830970 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c35b05be-3fec-4a42-af88-c80ad4c6833e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.839786 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv96p\" (UniqueName: \"kubernetes.io/projected/c35b05be-3fec-4a42-af88-c80ad4c6833e-kube-api-access-rv96p\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.852786 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c35b05be-3fec-4a42-af88-c80ad4c6833e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.991232 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sqg9\" (UniqueName: \"kubernetes.io/projected/2d1e14cd-4110-4ed1-9884-1318d980a844-kube-api-access-9sqg9\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.991299 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d1e14cd-4110-4ed1-9884-1318d980a844-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.991341 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1e14cd-4110-4ed1-9884-1318d980a844-config\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.991376 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1e14cd-4110-4ed1-9884-1318d980a844-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.991426 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1e14cd-4110-4ed1-9884-1318d980a844-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.991462 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d1e14cd-4110-4ed1-9884-1318d980a844-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.991493 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.991525 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1e14cd-4110-4ed1-9884-1318d980a844-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.992786 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1e14cd-4110-4ed1-9884-1318d980a844-config\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.993128 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.993288 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d1e14cd-4110-4ed1-9884-1318d980a844-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:22 crc kubenswrapper[4776]: I1204 09:58:22.993834 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d1e14cd-4110-4ed1-9884-1318d980a844-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:23 crc kubenswrapper[4776]: I1204 09:58:23.000038 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1e14cd-4110-4ed1-9884-1318d980a844-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" 
(UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:23 crc kubenswrapper[4776]: I1204 09:58:23.002707 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1e14cd-4110-4ed1-9884-1318d980a844-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:23 crc kubenswrapper[4776]: I1204 09:58:23.004846 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1e14cd-4110-4ed1-9884-1318d980a844-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:23 crc kubenswrapper[4776]: I1204 09:58:23.023883 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sqg9\" (UniqueName: \"kubernetes.io/projected/2d1e14cd-4110-4ed1-9884-1318d980a844-kube-api-access-9sqg9\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:23 crc kubenswrapper[4776]: I1204 09:58:23.036143 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d1e14cd-4110-4ed1-9884-1318d980a844\") " pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:23 crc kubenswrapper[4776]: I1204 09:58:23.128557 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:23 crc kubenswrapper[4776]: I1204 09:58:23.295108 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:32 crc kubenswrapper[4776]: E1204 09:58:32.643975 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:af9990558bda9e3f530d5f006eb13cc1e4a3ef9d3508dd9afdc91ab62586cfcd: Get \"https://quay.io/v2/podified-antelope-centos9/openstack-rabbitmq/blobs/sha256:af9990558bda9e3f530d5f006eb13cc1e4a3ef9d3508dd9afdc91ab62586cfcd\": context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 04 09:58:32 crc kubenswrapper[4776]: E1204 09:58:32.645155 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-65d6s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(1b1b8bd1-3c18-4127-bb66-a3f99b106b8e): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:af9990558bda9e3f530d5f006eb13cc1e4a3ef9d3508dd9afdc91ab62586cfcd: Get 
\"https://quay.io/v2/podified-antelope-centos9/openstack-rabbitmq/blobs/sha256:af9990558bda9e3f530d5f006eb13cc1e4a3ef9d3508dd9afdc91ab62586cfcd\": context canceled" logger="UnhandledError" Dec 04 09:58:32 crc kubenswrapper[4776]: E1204 09:58:32.646764 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:af9990558bda9e3f530d5f006eb13cc1e4a3ef9d3508dd9afdc91ab62586cfcd: Get \\\"https://quay.io/v2/podified-antelope-centos9/openstack-rabbitmq/blobs/sha256:af9990558bda9e3f530d5f006eb13cc1e4a3ef9d3508dd9afdc91ab62586cfcd\\\": context canceled\"" pod="openstack/rabbitmq-server-0" podUID="1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" Dec 04 09:58:33 crc kubenswrapper[4776]: E1204 09:58:33.206726 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" Dec 04 09:58:33 crc kubenswrapper[4776]: E1204 09:58:33.600246 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 09:58:33 crc kubenswrapper[4776]: E1204 09:58:33.600417 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cq2lg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-lhh6z_openstack(a246b792-5a73-4f08-9373-8a719864bd7d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:58:33 crc kubenswrapper[4776]: E1204 09:58:33.601703 4776 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" podUID="a246b792-5a73-4f08-9373-8a719864bd7d" Dec 04 09:58:34 crc kubenswrapper[4776]: E1204 09:58:34.213245 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" podUID="a246b792-5a73-4f08-9373-8a719864bd7d" Dec 04 09:58:34 crc kubenswrapper[4776]: E1204 09:58:34.777604 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 09:58:34 crc kubenswrapper[4776]: E1204 09:58:34.778158 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4ghtz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-8svng_openstack(ec13a2bf-8612-4605-a179-dcce0a2dfe06): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:58:34 crc kubenswrapper[4776]: E1204 09:58:34.779427 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-8svng" podUID="ec13a2bf-8612-4605-a179-dcce0a2dfe06" Dec 04 09:58:34 crc kubenswrapper[4776]: E1204 09:58:34.789525 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 09:58:34 crc kubenswrapper[4776]: E1204 09:58:34.789716 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kfhrh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullP
olicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-sq9zf_openstack(0b689a22-b93d-4fd7-80ac-6593ade4066a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:58:34 crc kubenswrapper[4776]: E1204 09:58:34.791144 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" podUID="0b689a22-b93d-4fd7-80ac-6593ade4066a" Dec 04 09:58:34 crc kubenswrapper[4776]: E1204 09:58:34.804675 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 09:58:34 crc kubenswrapper[4776]: E1204 09:58:34.804851 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fnvwd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-pzxpv_openstack(0b66163a-a528-44d4-9e9d-4eebd6969c60): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 09:58:34 crc kubenswrapper[4776]: E1204 09:58:34.806278 4776 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" podUID="0b66163a-a528-44d4-9e9d-4eebd6969c60" Dec 04 09:58:35 crc kubenswrapper[4776]: E1204 09:58:35.272709 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" podUID="0b689a22-b93d-4fd7-80ac-6593ade4066a" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.042029 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 09:58:36 crc kubenswrapper[4776]: W1204 09:58:36.045850 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1597214_7e53_46e4_8ba2_3732fc1ebf29.slice/crio-72fb579195803ce952144eaa7f6923c2653a8a088ed3bd1a20d6c8984a058d72 WatchSource:0}: Error finding container 72fb579195803ce952144eaa7f6923c2653a8a088ed3bd1a20d6c8984a058d72: Status 404 returned error can't find the container with id 72fb579195803ce952144eaa7f6923c2653a8a088ed3bd1a20d6c8984a058d72 Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.099804 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.182743 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.304965 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnvwd\" (UniqueName: \"kubernetes.io/projected/0b66163a-a528-44d4-9e9d-4eebd6969c60-kube-api-access-fnvwd\") pod \"0b66163a-a528-44d4-9e9d-4eebd6969c60\" (UID: \"0b66163a-a528-44d4-9e9d-4eebd6969c60\") " Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.305035 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b66163a-a528-44d4-9e9d-4eebd6969c60-dns-svc\") pod \"0b66163a-a528-44d4-9e9d-4eebd6969c60\" (UID: \"0b66163a-a528-44d4-9e9d-4eebd6969c60\") " Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.305090 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b66163a-a528-44d4-9e9d-4eebd6969c60-config\") pod \"0b66163a-a528-44d4-9e9d-4eebd6969c60\" (UID: \"0b66163a-a528-44d4-9e9d-4eebd6969c60\") " Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.307686 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b66163a-a528-44d4-9e9d-4eebd6969c60-config" (OuterVolumeSpecName: "config") pod "0b66163a-a528-44d4-9e9d-4eebd6969c60" (UID: "0b66163a-a528-44d4-9e9d-4eebd6969c60"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.309954 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b66163a-a528-44d4-9e9d-4eebd6969c60-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0b66163a-a528-44d4-9e9d-4eebd6969c60" (UID: "0b66163a-a528-44d4-9e9d-4eebd6969c60"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.316510 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b66163a-a528-44d4-9e9d-4eebd6969c60-kube-api-access-fnvwd" (OuterVolumeSpecName: "kube-api-access-fnvwd") pod "0b66163a-a528-44d4-9e9d-4eebd6969c60" (UID: "0b66163a-a528-44d4-9e9d-4eebd6969c60"). InnerVolumeSpecName "kube-api-access-fnvwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.378694 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8svng" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.410594 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec13a2bf-8612-4605-a179-dcce0a2dfe06-config\") pod \"ec13a2bf-8612-4605-a179-dcce0a2dfe06\" (UID: \"ec13a2bf-8612-4605-a179-dcce0a2dfe06\") " Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.410672 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ghtz\" (UniqueName: \"kubernetes.io/projected/ec13a2bf-8612-4605-a179-dcce0a2dfe06-kube-api-access-4ghtz\") pod \"ec13a2bf-8612-4605-a179-dcce0a2dfe06\" (UID: \"ec13a2bf-8612-4605-a179-dcce0a2dfe06\") " Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.411098 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnvwd\" (UniqueName: \"kubernetes.io/projected/0b66163a-a528-44d4-9e9d-4eebd6969c60-kube-api-access-fnvwd\") on node \"crc\" DevicePath \"\"" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.411113 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b66163a-a528-44d4-9e9d-4eebd6969c60-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.411144 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b66163a-a528-44d4-9e9d-4eebd6969c60-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.413333 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec13a2bf-8612-4605-a179-dcce0a2dfe06-config" (OuterVolumeSpecName: "config") pod "ec13a2bf-8612-4605-a179-dcce0a2dfe06" (UID: "ec13a2bf-8612-4605-a179-dcce0a2dfe06"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.417936 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec13a2bf-8612-4605-a179-dcce0a2dfe06-kube-api-access-4ghtz" (OuterVolumeSpecName: "kube-api-access-4ghtz") pod "ec13a2bf-8612-4605-a179-dcce0a2dfe06" (UID: "ec13a2bf-8612-4605-a179-dcce0a2dfe06"). InnerVolumeSpecName "kube-api-access-4ghtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.428906 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-8svng" event={"ID":"ec13a2bf-8612-4605-a179-dcce0a2dfe06","Type":"ContainerDied","Data":"720f927b351847761560af09d5fb29a65d7d3017d04e86dd1201c6778a2d94ec"} Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.429009 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8svng" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.450550 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"5787add21e877423617566cc01fd1cd5d93ab12b7726098df3a77184a49fa270"} Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.466599 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c1597214-7e53-46e4-8ba2-3732fc1ebf29","Type":"ContainerStarted","Data":"72fb579195803ce952144eaa7f6923c2653a8a088ed3bd1a20d6c8984a058d72"} Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.474598 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" event={"ID":"0b66163a-a528-44d4-9e9d-4eebd6969c60","Type":"ContainerDied","Data":"1830141efa4b9064796ce79225ef66339c1b22f7115e82cafa999e8c7b7e6604"} Dec 04 09:58:36 crc 
kubenswrapper[4776]: I1204 09:58:36.474718 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pzxpv" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.520870 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec13a2bf-8612-4605-a179-dcce0a2dfe06-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.522461 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ghtz\" (UniqueName: \"kubernetes.io/projected/ec13a2bf-8612-4605-a179-dcce0a2dfe06-kube-api-access-4ghtz\") on node \"crc\" DevicePath \"\"" Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.568431 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.617409 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8svng"] Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.624413 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8svng"] Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.629673 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tchdq"] Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.633750 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.639767 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.651204 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pzxpv"] Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.654950 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pzxpv"] Dec 04 09:58:36 
crc kubenswrapper[4776]: I1204 09:58:36.783587 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9mnct"] Dec 04 09:58:36 crc kubenswrapper[4776]: W1204 09:58:36.792465 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd5ed17f_c4f4_4b17_b14c_d8717fc116f6.slice/crio-f000c5917d25762f5eda828cd6bdc88e67b8710034b6e888e92685484dfdc326 WatchSource:0}: Error finding container f000c5917d25762f5eda828cd6bdc88e67b8710034b6e888e92685484dfdc326: Status 404 returned error can't find the container with id f000c5917d25762f5eda828cd6bdc88e67b8710034b6e888e92685484dfdc326 Dec 04 09:58:36 crc kubenswrapper[4776]: I1204 09:58:36.883592 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 09:58:36 crc kubenswrapper[4776]: W1204 09:58:36.890957 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d1e14cd_4110_4ed1_9884_1318d980a844.slice/crio-6241994855a13b759016af3a0c83377690b553345ae73da111134749e4514cfc WatchSource:0}: Error finding container 6241994855a13b759016af3a0c83377690b553345ae73da111134749e4514cfc: Status 404 returned error can't find the container with id 6241994855a13b759016af3a0c83377690b553345ae73da111134749e4514cfc Dec 04 09:58:37 crc kubenswrapper[4776]: I1204 09:58:37.466851 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b66163a-a528-44d4-9e9d-4eebd6969c60" path="/var/lib/kubelet/pods/0b66163a-a528-44d4-9e9d-4eebd6969c60/volumes" Dec 04 09:58:37 crc kubenswrapper[4776]: I1204 09:58:37.467628 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec13a2bf-8612-4605-a179-dcce0a2dfe06" path="/var/lib/kubelet/pods/ec13a2bf-8612-4605-a179-dcce0a2dfe06/volumes" Dec 04 09:58:37 crc kubenswrapper[4776]: I1204 09:58:37.485063 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c","Type":"ContainerStarted","Data":"e6997a5028e360b37cee78c19260cfa8ab0bf6e191c0066aff520cf03484ca6a"} Dec 04 09:58:37 crc kubenswrapper[4776]: I1204 09:58:37.488275 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"be0c172a-45d2-4fab-940c-f343c9e227fc","Type":"ContainerStarted","Data":"2739ae3b53f12a4b2278f6c709f799489ad322eda21d4b091c2e34b3b400e9b4"} Dec 04 09:58:37 crc kubenswrapper[4776]: I1204 09:58:37.490228 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9mnct" event={"ID":"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6","Type":"ContainerStarted","Data":"f000c5917d25762f5eda828cd6bdc88e67b8710034b6e888e92685484dfdc326"} Dec 04 09:58:37 crc kubenswrapper[4776]: I1204 09:58:37.499332 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c35b05be-3fec-4a42-af88-c80ad4c6833e","Type":"ContainerStarted","Data":"a7e04132f463f9f9dfd2d2a049b80361714313c568c4ca119fb965d9b9d264e5"} Dec 04 09:58:37 crc kubenswrapper[4776]: I1204 09:58:37.506007 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2d1e14cd-4110-4ed1-9884-1318d980a844","Type":"ContainerStarted","Data":"6241994855a13b759016af3a0c83377690b553345ae73da111134749e4514cfc"} Dec 04 09:58:37 crc kubenswrapper[4776]: I1204 09:58:37.510335 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f57cb5d1-1baa-4fc7-8c71-16d1138dab82","Type":"ContainerStarted","Data":"176599bab83495559a072732d4606c07a81028607f92f33808c416b4de9e977c"} Dec 04 09:58:37 crc kubenswrapper[4776]: I1204 09:58:37.525784 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"b38109cb-9fe9-429d-b580-999d6978f536","Type":"ContainerStarted","Data":"6bd966f0d41a3ed9245ba6cda3d5c0b23c1ca16cb8c98d18c22099cac9e607bb"} Dec 04 09:58:37 crc kubenswrapper[4776]: I1204 09:58:37.528578 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tchdq" event={"ID":"1100839e-9cfb-4361-a653-321d0d431072","Type":"ContainerStarted","Data":"7a7722f23c7790eaf39933f8fa98b1750ceeafe8948dae6c1033d02430f34f2e"} Dec 04 09:58:46 crc kubenswrapper[4776]: I1204 09:58:46.822536 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c1597214-7e53-46e4-8ba2-3732fc1ebf29","Type":"ContainerStarted","Data":"53cdb71a01277f92f65afba6d28f001c5f31f273c7ee6aa26dba1c3503080d24"} Dec 04 09:58:46 crc kubenswrapper[4776]: I1204 09:58:46.823472 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 04 09:58:46 crc kubenswrapper[4776]: I1204 09:58:46.824662 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"be0c172a-45d2-4fab-940c-f343c9e227fc","Type":"ContainerStarted","Data":"8585a5131c17c29fe0ee3c22ee55a52b341962093589ad30e9bec82cf6a40f63"} Dec 04 09:58:46 crc kubenswrapper[4776]: I1204 09:58:46.826794 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c35b05be-3fec-4a42-af88-c80ad4c6833e","Type":"ContainerStarted","Data":"ec3ff6fe05d6d246430ef1fe84cbfea62540fa3bec64e0b54988da54e421574b"} Dec 04 09:58:46 crc kubenswrapper[4776]: I1204 09:58:46.843987 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=25.506911516 podStartE2EDuration="33.843968403s" podCreationTimestamp="2025-12-04 09:58:13 +0000 UTC" firstStartedPulling="2025-12-04 09:58:36.049431174 +0000 UTC m=+1160.915911551" lastFinishedPulling="2025-12-04 09:58:44.386488061 +0000 UTC m=+1169.252968438" 
observedRunningTime="2025-12-04 09:58:46.839161813 +0000 UTC m=+1171.705642210" watchObservedRunningTime="2025-12-04 09:58:46.843968403 +0000 UTC m=+1171.710448780" Dec 04 09:58:47 crc kubenswrapper[4776]: I1204 09:58:47.834647 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9mnct" event={"ID":"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6","Type":"ContainerStarted","Data":"02661dc69fd263d3d2f33452cbb6d0bc7a23df17334fdef4bc2fd856a53fca92"} Dec 04 09:58:47 crc kubenswrapper[4776]: I1204 09:58:47.836273 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2d1e14cd-4110-4ed1-9884-1318d980a844","Type":"ContainerStarted","Data":"4e9ef773dfbd5b7242b8e6880ea9f67bf6f012b46e46ca457a1aa44ae36a7cc5"} Dec 04 09:58:48 crc kubenswrapper[4776]: I1204 09:58:48.845963 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b38109cb-9fe9-429d-b580-999d6978f536","Type":"ContainerStarted","Data":"0444ceeeb9f6836270880e3f83671d4bbc4e9f9b3579abb0e673f12244615614"} Dec 04 09:58:48 crc kubenswrapper[4776]: I1204 09:58:48.849002 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tchdq" event={"ID":"1100839e-9cfb-4361-a653-321d0d431072","Type":"ContainerStarted","Data":"07351e90cfe95ad8599b97c48cb5d365ce2bf04166efa401089bc22880d8eb2b"} Dec 04 09:58:48 crc kubenswrapper[4776]: I1204 09:58:48.849136 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-tchdq" Dec 04 09:58:48 crc kubenswrapper[4776]: I1204 09:58:48.851254 4776 generic.go:334] "Generic (PLEG): container finished" podID="bd5ed17f-c4f4-4b17-b14c-d8717fc116f6" containerID="02661dc69fd263d3d2f33452cbb6d0bc7a23df17334fdef4bc2fd856a53fca92" exitCode=0 Dec 04 09:58:48 crc kubenswrapper[4776]: I1204 09:58:48.851352 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9mnct" 
event={"ID":"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6","Type":"ContainerDied","Data":"02661dc69fd263d3d2f33452cbb6d0bc7a23df17334fdef4bc2fd856a53fca92"} Dec 04 09:58:48 crc kubenswrapper[4776]: I1204 09:58:48.853158 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e","Type":"ContainerStarted","Data":"b3555c8dd05aa4b2bc15c3ab727e5bc2259436e7c875db3cd332992042699a3c"} Dec 04 09:58:48 crc kubenswrapper[4776]: I1204 09:58:48.856459 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f57cb5d1-1baa-4fc7-8c71-16d1138dab82","Type":"ContainerStarted","Data":"adef3fbc5f17f46ef92e49977033fc192ebb6059ac071131f71e1cf991a17761"} Dec 04 09:58:48 crc kubenswrapper[4776]: I1204 09:58:48.856517 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 04 09:58:48 crc kubenswrapper[4776]: I1204 09:58:48.933406 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-tchdq" podStartSLOduration=21.231748644 podStartE2EDuration="29.933381494s" podCreationTimestamp="2025-12-04 09:58:19 +0000 UTC" firstStartedPulling="2025-12-04 09:58:36.573051141 +0000 UTC m=+1161.439531518" lastFinishedPulling="2025-12-04 09:58:45.274683991 +0000 UTC m=+1170.141164368" observedRunningTime="2025-12-04 09:58:48.930213494 +0000 UTC m=+1173.796693881" watchObservedRunningTime="2025-12-04 09:58:48.933381494 +0000 UTC m=+1173.799861881" Dec 04 09:58:48 crc kubenswrapper[4776]: I1204 09:58:48.986458 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.197879649 podStartE2EDuration="33.986434833s" podCreationTimestamp="2025-12-04 09:58:15 +0000 UTC" firstStartedPulling="2025-12-04 09:58:36.576385356 +0000 UTC m=+1161.442865723" lastFinishedPulling="2025-12-04 09:58:46.36494053 +0000 UTC 
m=+1171.231420907" observedRunningTime="2025-12-04 09:58:48.976341517 +0000 UTC m=+1173.842821914" watchObservedRunningTime="2025-12-04 09:58:48.986434833 +0000 UTC m=+1173.852915210" Dec 04 09:58:49 crc kubenswrapper[4776]: I1204 09:58:49.864432 4776 generic.go:334] "Generic (PLEG): container finished" podID="a246b792-5a73-4f08-9373-8a719864bd7d" containerID="9185a48334d81fe80d0dfb267f8483d10a8ca285b4cd7abebdb82d3c1beaa08e" exitCode=0 Dec 04 09:58:49 crc kubenswrapper[4776]: I1204 09:58:49.864638 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" event={"ID":"a246b792-5a73-4f08-9373-8a719864bd7d","Type":"ContainerDied","Data":"9185a48334d81fe80d0dfb267f8483d10a8ca285b4cd7abebdb82d3c1beaa08e"} Dec 04 09:58:50 crc kubenswrapper[4776]: I1204 09:58:50.874370 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9mnct" event={"ID":"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6","Type":"ContainerStarted","Data":"5779d18e1006545b5c3ebc3785aca1bb33d8243ba6cd5415d78b260fa9094139"} Dec 04 09:58:50 crc kubenswrapper[4776]: I1204 09:58:50.874940 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9mnct" event={"ID":"bd5ed17f-c4f4-4b17-b14c-d8717fc116f6","Type":"ContainerStarted","Data":"8a1c14ed62426be54d70325284bb6304b36320da1c306d89211f1860f48b8240"} Dec 04 09:58:50 crc kubenswrapper[4776]: I1204 09:58:50.875995 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:50 crc kubenswrapper[4776]: I1204 09:58:50.876020 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:58:50 crc kubenswrapper[4776]: I1204 09:58:50.879164 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" 
event={"ID":"a246b792-5a73-4f08-9373-8a719864bd7d","Type":"ContainerStarted","Data":"d9a11f11e01d11e4fee8eeec5c79a02cf5a9a4831bc302f93c070b38b8bd6ebb"} Dec 04 09:58:50 crc kubenswrapper[4776]: I1204 09:58:50.879816 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:58:50 crc kubenswrapper[4776]: I1204 09:58:50.882456 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" event={"ID":"0b689a22-b93d-4fd7-80ac-6593ade4066a","Type":"ContainerStarted","Data":"f29d45a1e3045881900f9987b9bcaf2e8981fbeccc53bcd7df5fa4add70b2770"} Dec 04 09:58:50 crc kubenswrapper[4776]: I1204 09:58:50.901494 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-9mnct" podStartSLOduration=24.084720456 podStartE2EDuration="31.901470549s" podCreationTimestamp="2025-12-04 09:58:19 +0000 UTC" firstStartedPulling="2025-12-04 09:58:36.795327623 +0000 UTC m=+1161.661808000" lastFinishedPulling="2025-12-04 09:58:44.612077716 +0000 UTC m=+1169.478558093" observedRunningTime="2025-12-04 09:58:50.900710845 +0000 UTC m=+1175.767191232" watchObservedRunningTime="2025-12-04 09:58:50.901470549 +0000 UTC m=+1175.767950936" Dec 04 09:58:50 crc kubenswrapper[4776]: I1204 09:58:50.935858 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" podStartSLOduration=4.44085743 podStartE2EDuration="41.935835903s" podCreationTimestamp="2025-12-04 09:58:09 +0000 UTC" firstStartedPulling="2025-12-04 09:58:10.946170665 +0000 UTC m=+1135.812651042" lastFinishedPulling="2025-12-04 09:58:48.441149138 +0000 UTC m=+1173.307629515" observedRunningTime="2025-12-04 09:58:50.930339651 +0000 UTC m=+1175.796820028" watchObservedRunningTime="2025-12-04 09:58:50.935835903 +0000 UTC m=+1175.802316280" Dec 04 09:58:52 crc kubenswrapper[4776]: I1204 09:58:52.002044 4776 generic.go:334] "Generic (PLEG): 
container finished" podID="0b689a22-b93d-4fd7-80ac-6593ade4066a" containerID="f29d45a1e3045881900f9987b9bcaf2e8981fbeccc53bcd7df5fa4add70b2770" exitCode=0 Dec 04 09:58:52 crc kubenswrapper[4776]: I1204 09:58:52.002113 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" event={"ID":"0b689a22-b93d-4fd7-80ac-6593ade4066a","Type":"ContainerDied","Data":"f29d45a1e3045881900f9987b9bcaf2e8981fbeccc53bcd7df5fa4add70b2770"} Dec 04 09:58:53 crc kubenswrapper[4776]: I1204 09:58:53.958076 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 04 09:58:55 crc kubenswrapper[4776]: I1204 09:58:55.058847 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2d1e14cd-4110-4ed1-9884-1318d980a844","Type":"ContainerStarted","Data":"02b35549f2530e37486def84211e07e7c888019e3a94bb3390451b69ff9e9198"} Dec 04 09:58:55 crc kubenswrapper[4776]: I1204 09:58:55.065687 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" event={"ID":"0b689a22-b93d-4fd7-80ac-6593ade4066a","Type":"ContainerStarted","Data":"b0b9bb8b70b389c1253a4d8942a40f3be3942f1b3a6e95e9272f49fc681c9446"} Dec 04 09:58:55 crc kubenswrapper[4776]: I1204 09:58:55.065971 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:55 crc kubenswrapper[4776]: I1204 09:58:55.068408 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c35b05be-3fec-4a42-af88-c80ad4c6833e","Type":"ContainerStarted","Data":"c1e546b0c0cdfd7837588c5f9610c750b3d64cfe1ecdf15c123295ca643e9e6a"} Dec 04 09:58:55 crc kubenswrapper[4776]: I1204 09:58:55.106181 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=17.1013268 podStartE2EDuration="34.106158938s" podCreationTimestamp="2025-12-04 
09:58:21 +0000 UTC" firstStartedPulling="2025-12-04 09:58:36.896054873 +0000 UTC m=+1161.762535250" lastFinishedPulling="2025-12-04 09:58:53.900887011 +0000 UTC m=+1178.767367388" observedRunningTime="2025-12-04 09:58:55.087752042 +0000 UTC m=+1179.954232419" watchObservedRunningTime="2025-12-04 09:58:55.106158938 +0000 UTC m=+1179.972639335" Dec 04 09:58:55 crc kubenswrapper[4776]: I1204 09:58:55.107617 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" podStartSLOduration=-9223371990.74717 podStartE2EDuration="46.107607023s" podCreationTimestamp="2025-12-04 09:58:09 +0000 UTC" firstStartedPulling="2025-12-04 09:58:10.511376336 +0000 UTC m=+1135.377856713" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:58:55.105461836 +0000 UTC m=+1179.971942213" watchObservedRunningTime="2025-12-04 09:58:55.107607023 +0000 UTC m=+1179.974087400" Dec 04 09:58:55 crc kubenswrapper[4776]: I1204 09:58:55.133893 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=16.552678761 podStartE2EDuration="34.133871995s" podCreationTimestamp="2025-12-04 09:58:21 +0000 UTC" firstStartedPulling="2025-12-04 09:58:36.34342773 +0000 UTC m=+1161.209908107" lastFinishedPulling="2025-12-04 09:58:53.924620964 +0000 UTC m=+1178.791101341" observedRunningTime="2025-12-04 09:58:55.129542729 +0000 UTC m=+1179.996023126" watchObservedRunningTime="2025-12-04 09:58:55.133871995 +0000 UTC m=+1180.000352372" Dec 04 09:58:55 crc kubenswrapper[4776]: I1204 09:58:55.315133 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:58:55 crc kubenswrapper[4776]: I1204 09:58:55.381555 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-sq9zf"] Dec 04 09:58:55 crc kubenswrapper[4776]: I1204 09:58:55.971385 4776 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 09:58:56 crc kubenswrapper[4776]: I1204 09:58:56.129006 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:56 crc kubenswrapper[4776]: I1204 09:58:56.166544 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:56 crc kubenswrapper[4776]: I1204 09:58:56.299370 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:56 crc kubenswrapper[4776]: I1204 09:58:56.338704 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.084278 4776 generic.go:334] "Generic (PLEG): container finished" podID="b38109cb-9fe9-429d-b580-999d6978f536" containerID="0444ceeeb9f6836270880e3f83671d4bbc4e9f9b3579abb0e673f12244615614" exitCode=0 Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.084368 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b38109cb-9fe9-429d-b580-999d6978f536","Type":"ContainerDied","Data":"0444ceeeb9f6836270880e3f83671d4bbc4e9f9b3579abb0e673f12244615614"} Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.086160 4776 generic.go:334] "Generic (PLEG): container finished" podID="be0c172a-45d2-4fab-940c-f343c9e227fc" containerID="8585a5131c17c29fe0ee3c22ee55a52b341962093589ad30e9bec82cf6a40f63" exitCode=0 Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.086263 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"be0c172a-45d2-4fab-940c-f343c9e227fc","Type":"ContainerDied","Data":"8585a5131c17c29fe0ee3c22ee55a52b341962093589ad30e9bec82cf6a40f63"} Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.086361 4776 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" podUID="0b689a22-b93d-4fd7-80ac-6593ade4066a" containerName="dnsmasq-dns" containerID="cri-o://b0b9bb8b70b389c1253a4d8942a40f3be3942f1b3a6e95e9272f49fc681c9446" gracePeriod=10 Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.086588 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.087168 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.135457 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.141866 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.417546 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-chpcr"] Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.429056 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.436357 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.446776 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-chpcr"] Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.487203 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-2wd87"] Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.488362 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.489710 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-2wd87"] Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.506741 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.524173 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-chpcr\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.524239 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b54s2\" (UniqueName: \"kubernetes.io/projected/8b394862-e729-482e-b741-d4b41a7fc5c1-kube-api-access-b54s2\") pod \"dnsmasq-dns-7fd796d7df-chpcr\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.524366 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-chpcr\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.524418 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-config\") pod \"dnsmasq-dns-7fd796d7df-chpcr\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " 
pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.528119 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.539443 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.542548 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.542826 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-7kxqq" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.543123 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.543262 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.586274 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.608750 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-chpcr"] Dec 04 09:58:57 crc kubenswrapper[4776]: E1204 09:58:57.612363 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-b54s2 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" podUID="8b394862-e729-482e-b741-d4b41a7fc5c1" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632316 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ec49e875-c217-4d3a-b821-a870a4ad1d24-ovs-rundir\") pod 
\"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632422 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec49e875-c217-4d3a-b821-a870a4ad1d24-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632469 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b19391ae-29bb-4eef-a99b-c8746488c6f5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632503 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec49e875-c217-4d3a-b821-a870a4ad1d24-combined-ca-bundle\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632522 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z97nt\" (UniqueName: \"kubernetes.io/projected/ec49e875-c217-4d3a-b821-a870a4ad1d24-kube-api-access-z97nt\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632543 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b19391ae-29bb-4eef-a99b-c8746488c6f5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632564 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec49e875-c217-4d3a-b821-a870a4ad1d24-config\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632627 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b19391ae-29bb-4eef-a99b-c8746488c6f5-scripts\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632656 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-chpcr\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632689 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19391ae-29bb-4eef-a99b-c8746488c6f5-config\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632714 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfg7t\" (UniqueName: \"kubernetes.io/projected/b19391ae-29bb-4eef-a99b-c8746488c6f5-kube-api-access-pfg7t\") pod \"ovn-northd-0\" (UID: 
\"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632738 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-config\") pod \"dnsmasq-dns-7fd796d7df-chpcr\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632773 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-chpcr\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632811 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b19391ae-29bb-4eef-a99b-c8746488c6f5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632833 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19391ae-29bb-4eef-a99b-c8746488c6f5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632854 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b54s2\" (UniqueName: \"kubernetes.io/projected/8b394862-e729-482e-b741-d4b41a7fc5c1-kube-api-access-b54s2\") pod \"dnsmasq-dns-7fd796d7df-chpcr\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 
09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.632873 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ec49e875-c217-4d3a-b821-a870a4ad1d24-ovn-rundir\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.634679 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-chpcr\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.636172 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-config\") pod \"dnsmasq-dns-7fd796d7df-chpcr\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.638468 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-chpcr\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.665113 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.668073 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b54s2\" (UniqueName: \"kubernetes.io/projected/8b394862-e729-482e-b741-d4b41a7fc5c1-kube-api-access-b54s2\") pod \"dnsmasq-dns-7fd796d7df-chpcr\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.669991 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lc7wm"] Dec 04 09:58:57 crc kubenswrapper[4776]: E1204 09:58:57.670374 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b689a22-b93d-4fd7-80ac-6593ade4066a" containerName="dnsmasq-dns" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.670394 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b689a22-b93d-4fd7-80ac-6593ade4066a" containerName="dnsmasq-dns" Dec 04 09:58:57 crc kubenswrapper[4776]: E1204 09:58:57.670427 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b689a22-b93d-4fd7-80ac-6593ade4066a" containerName="init" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.670436 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b689a22-b93d-4fd7-80ac-6593ade4066a" containerName="init" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.670648 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b689a22-b93d-4fd7-80ac-6593ade4066a" containerName="dnsmasq-dns" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.671683 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.685631 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.692372 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lc7wm"] Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.734178 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b19391ae-29bb-4eef-a99b-c8746488c6f5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.734287 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19391ae-29bb-4eef-a99b-c8746488c6f5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.734326 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ec49e875-c217-4d3a-b821-a870a4ad1d24-ovn-rundir\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.734367 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ec49e875-c217-4d3a-b821-a870a4ad1d24-ovs-rundir\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.734391 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec49e875-c217-4d3a-b821-a870a4ad1d24-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.734425 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b19391ae-29bb-4eef-a99b-c8746488c6f5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.734448 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec49e875-c217-4d3a-b821-a870a4ad1d24-combined-ca-bundle\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.734464 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z97nt\" (UniqueName: \"kubernetes.io/projected/ec49e875-c217-4d3a-b821-a870a4ad1d24-kube-api-access-z97nt\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.734480 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b19391ae-29bb-4eef-a99b-c8746488c6f5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.734497 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ec49e875-c217-4d3a-b821-a870a4ad1d24-config\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.734527 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b19391ae-29bb-4eef-a99b-c8746488c6f5-scripts\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.734561 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19391ae-29bb-4eef-a99b-c8746488c6f5-config\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.734582 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfg7t\" (UniqueName: \"kubernetes.io/projected/b19391ae-29bb-4eef-a99b-c8746488c6f5-kube-api-access-pfg7t\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.735395 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b19391ae-29bb-4eef-a99b-c8746488c6f5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.740376 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b19391ae-29bb-4eef-a99b-c8746488c6f5-scripts\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.740696 
4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ec49e875-c217-4d3a-b821-a870a4ad1d24-ovs-rundir\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.740759 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ec49e875-c217-4d3a-b821-a870a4ad1d24-ovn-rundir\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.740821 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec49e875-c217-4d3a-b821-a870a4ad1d24-combined-ca-bundle\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.741110 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19391ae-29bb-4eef-a99b-c8746488c6f5-config\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.742123 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b19391ae-29bb-4eef-a99b-c8746488c6f5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.743276 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b19391ae-29bb-4eef-a99b-c8746488c6f5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.744374 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec49e875-c217-4d3a-b821-a870a4ad1d24-config\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.745694 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec49e875-c217-4d3a-b821-a870a4ad1d24-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2wd87\" (UID: \"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.746073 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b19391ae-29bb-4eef-a99b-c8746488c6f5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.757191 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfg7t\" (UniqueName: \"kubernetes.io/projected/b19391ae-29bb-4eef-a99b-c8746488c6f5-kube-api-access-pfg7t\") pod \"ovn-northd-0\" (UID: \"b19391ae-29bb-4eef-a99b-c8746488c6f5\") " pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.761514 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z97nt\" (UniqueName: \"kubernetes.io/projected/ec49e875-c217-4d3a-b821-a870a4ad1d24-kube-api-access-z97nt\") pod \"ovn-controller-metrics-2wd87\" (UID: 
\"ec49e875-c217-4d3a-b821-a870a4ad1d24\") " pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.836119 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b689a22-b93d-4fd7-80ac-6593ade4066a-config\") pod \"0b689a22-b93d-4fd7-80ac-6593ade4066a\" (UID: \"0b689a22-b93d-4fd7-80ac-6593ade4066a\") " Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.836243 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b689a22-b93d-4fd7-80ac-6593ade4066a-dns-svc\") pod \"0b689a22-b93d-4fd7-80ac-6593ade4066a\" (UID: \"0b689a22-b93d-4fd7-80ac-6593ade4066a\") " Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.836331 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfhrh\" (UniqueName: \"kubernetes.io/projected/0b689a22-b93d-4fd7-80ac-6593ade4066a-kube-api-access-kfhrh\") pod \"0b689a22-b93d-4fd7-80ac-6593ade4066a\" (UID: \"0b689a22-b93d-4fd7-80ac-6593ade4066a\") " Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.836596 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.836672 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.836770 
4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-config\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.836810 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.836990 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lttjg\" (UniqueName: \"kubernetes.io/projected/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-kube-api-access-lttjg\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.842952 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b689a22-b93d-4fd7-80ac-6593ade4066a-kube-api-access-kfhrh" (OuterVolumeSpecName: "kube-api-access-kfhrh") pod "0b689a22-b93d-4fd7-80ac-6593ade4066a" (UID: "0b689a22-b93d-4fd7-80ac-6593ade4066a"). InnerVolumeSpecName "kube-api-access-kfhrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.869891 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-2wd87" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.886474 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.886608 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b689a22-b93d-4fd7-80ac-6593ade4066a-config" (OuterVolumeSpecName: "config") pod "0b689a22-b93d-4fd7-80ac-6593ade4066a" (UID: "0b689a22-b93d-4fd7-80ac-6593ade4066a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.893355 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b689a22-b93d-4fd7-80ac-6593ade4066a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0b689a22-b93d-4fd7-80ac-6593ade4066a" (UID: "0b689a22-b93d-4fd7-80ac-6593ade4066a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.939996 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lttjg\" (UniqueName: \"kubernetes.io/projected/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-kube-api-access-lttjg\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.940118 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.940149 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: 
\"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.941024 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.941178 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-config\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.941196 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.941966 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-config\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.942019 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc 
kubenswrapper[4776]: I1204 09:58:57.942102 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfhrh\" (UniqueName: \"kubernetes.io/projected/0b689a22-b93d-4fd7-80ac-6593ade4066a-kube-api-access-kfhrh\") on node \"crc\" DevicePath \"\"" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.942117 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b689a22-b93d-4fd7-80ac-6593ade4066a-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.942129 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b689a22-b93d-4fd7-80ac-6593ade4066a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.942672 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:57 crc kubenswrapper[4776]: I1204 09:58:57.963677 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lttjg\" (UniqueName: \"kubernetes.io/projected/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-kube-api-access-lttjg\") pod \"dnsmasq-dns-86db49b7ff-lc7wm\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.017345 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.160472 4776 generic.go:334] "Generic (PLEG): container finished" podID="0b689a22-b93d-4fd7-80ac-6593ade4066a" containerID="b0b9bb8b70b389c1253a4d8942a40f3be3942f1b3a6e95e9272f49fc681c9446" exitCode=0 Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.160543 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" event={"ID":"0b689a22-b93d-4fd7-80ac-6593ade4066a","Type":"ContainerDied","Data":"b0b9bb8b70b389c1253a4d8942a40f3be3942f1b3a6e95e9272f49fc681c9446"} Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.160568 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" event={"ID":"0b689a22-b93d-4fd7-80ac-6593ade4066a","Type":"ContainerDied","Data":"8f0f26684ea4ef1bc9304ddb05da6326d7c74379708ae6aeb6c1953836930481"} Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.160585 4776 scope.go:117] "RemoveContainer" containerID="b0b9bb8b70b389c1253a4d8942a40f3be3942f1b3a6e95e9272f49fc681c9446" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.160701 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-sq9zf" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.220067 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-sq9zf"] Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.223761 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"be0c172a-45d2-4fab-940c-f343c9e227fc","Type":"ContainerStarted","Data":"d56fa012bf0f6e8b7b136fbe7f278888c6ce13cf0f4b996522cdd8f3c1adff16"} Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.235407 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b38109cb-9fe9-429d-b580-999d6978f536","Type":"ContainerStarted","Data":"165451e6c67730757bd116bf3fdb1420560a469dd6c46dc5360cbd623ea2c05a"} Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.235835 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.237805 4776 scope.go:117] "RemoveContainer" containerID="f29d45a1e3045881900f9987b9bcaf2e8981fbeccc53bcd7df5fa4add70b2770" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.250832 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.258886 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-sq9zf"] Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.275500 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=38.291673075 podStartE2EDuration="46.275473654s" podCreationTimestamp="2025-12-04 09:58:12 +0000 UTC" firstStartedPulling="2025-12-04 09:58:36.624652625 +0000 UTC m=+1161.491132992" lastFinishedPulling="2025-12-04 09:58:44.608453194 +0000 UTC m=+1169.474933571" observedRunningTime="2025-12-04 09:58:58.254779607 +0000 UTC m=+1183.121259994" watchObservedRunningTime="2025-12-04 09:58:58.275473654 +0000 UTC m=+1183.141954031" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.290466 4776 scope.go:117] "RemoveContainer" containerID="b0b9bb8b70b389c1253a4d8942a40f3be3942f1b3a6e95e9272f49fc681c9446" Dec 04 09:58:58 crc kubenswrapper[4776]: E1204 09:58:58.304164 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0b9bb8b70b389c1253a4d8942a40f3be3942f1b3a6e95e9272f49fc681c9446\": container with ID starting with b0b9bb8b70b389c1253a4d8942a40f3be3942f1b3a6e95e9272f49fc681c9446 not found: ID does not exist" containerID="b0b9bb8b70b389c1253a4d8942a40f3be3942f1b3a6e95e9272f49fc681c9446" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.304213 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0b9bb8b70b389c1253a4d8942a40f3be3942f1b3a6e95e9272f49fc681c9446"} err="failed to get container status \"b0b9bb8b70b389c1253a4d8942a40f3be3942f1b3a6e95e9272f49fc681c9446\": rpc error: code = NotFound desc = could not find container \"b0b9bb8b70b389c1253a4d8942a40f3be3942f1b3a6e95e9272f49fc681c9446\": container with ID starting with 
b0b9bb8b70b389c1253a4d8942a40f3be3942f1b3a6e95e9272f49fc681c9446 not found: ID does not exist" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.304242 4776 scope.go:117] "RemoveContainer" containerID="f29d45a1e3045881900f9987b9bcaf2e8981fbeccc53bcd7df5fa4add70b2770" Dec 04 09:58:58 crc kubenswrapper[4776]: E1204 09:58:58.305421 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f29d45a1e3045881900f9987b9bcaf2e8981fbeccc53bcd7df5fa4add70b2770\": container with ID starting with f29d45a1e3045881900f9987b9bcaf2e8981fbeccc53bcd7df5fa4add70b2770 not found: ID does not exist" containerID="f29d45a1e3045881900f9987b9bcaf2e8981fbeccc53bcd7df5fa4add70b2770" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.305441 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f29d45a1e3045881900f9987b9bcaf2e8981fbeccc53bcd7df5fa4add70b2770"} err="failed to get container status \"f29d45a1e3045881900f9987b9bcaf2e8981fbeccc53bcd7df5fa4add70b2770\": rpc error: code = NotFound desc = could not find container \"f29d45a1e3045881900f9987b9bcaf2e8981fbeccc53bcd7df5fa4add70b2770\": container with ID starting with f29d45a1e3045881900f9987b9bcaf2e8981fbeccc53bcd7df5fa4add70b2770 not found: ID does not exist" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.354627 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-ovsdbserver-nb\") pod \"8b394862-e729-482e-b741-d4b41a7fc5c1\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.354690 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b54s2\" (UniqueName: \"kubernetes.io/projected/8b394862-e729-482e-b741-d4b41a7fc5c1-kube-api-access-b54s2\") pod 
\"8b394862-e729-482e-b741-d4b41a7fc5c1\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.354741 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-dns-svc\") pod \"8b394862-e729-482e-b741-d4b41a7fc5c1\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.354996 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-config\") pod \"8b394862-e729-482e-b741-d4b41a7fc5c1\" (UID: \"8b394862-e729-482e-b741-d4b41a7fc5c1\") " Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.357244 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8b394862-e729-482e-b741-d4b41a7fc5c1" (UID: "8b394862-e729-482e-b741-d4b41a7fc5c1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.360328 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b394862-e729-482e-b741-d4b41a7fc5c1" (UID: "8b394862-e729-482e-b741-d4b41a7fc5c1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.361096 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-config" (OuterVolumeSpecName: "config") pod "8b394862-e729-482e-b741-d4b41a7fc5c1" (UID: "8b394862-e729-482e-b741-d4b41a7fc5c1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.377695 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b394862-e729-482e-b741-d4b41a7fc5c1-kube-api-access-b54s2" (OuterVolumeSpecName: "kube-api-access-b54s2") pod "8b394862-e729-482e-b741-d4b41a7fc5c1" (UID: "8b394862-e729-482e-b741-d4b41a7fc5c1"). InnerVolumeSpecName "kube-api-access-b54s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.457113 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.457161 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.457180 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b54s2\" (UniqueName: \"kubernetes.io/projected/8b394862-e729-482e-b741-d4b41a7fc5c1-kube-api-access-b54s2\") on node \"crc\" DevicePath \"\"" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.457191 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b394862-e729-482e-b741-d4b41a7fc5c1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.595502 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=38.078724539 podStartE2EDuration="47.595482713s" podCreationTimestamp="2025-12-04 09:58:11 +0000 UTC" firstStartedPulling="2025-12-04 09:58:36.627279237 +0000 UTC m=+1161.493759614" lastFinishedPulling="2025-12-04 09:58:46.144037411 
+0000 UTC m=+1171.010517788" observedRunningTime="2025-12-04 09:58:58.322746013 +0000 UTC m=+1183.189226390" watchObservedRunningTime="2025-12-04 09:58:58.595482713 +0000 UTC m=+1183.461963090" Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.598702 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-2wd87"] Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.653619 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 09:58:58 crc kubenswrapper[4776]: W1204 09:58:58.661726 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb19391ae_29bb_4eef_a99b_c8746488c6f5.slice/crio-39e3d5e076fb7e82cf18ef48de5c4caf02e7e9166d6a1f26da4b45e5510618c4 WatchSource:0}: Error finding container 39e3d5e076fb7e82cf18ef48de5c4caf02e7e9166d6a1f26da4b45e5510618c4: Status 404 returned error can't find the container with id 39e3d5e076fb7e82cf18ef48de5c4caf02e7e9166d6a1f26da4b45e5510618c4 Dec 04 09:58:58 crc kubenswrapper[4776]: I1204 09:58:58.746003 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lc7wm"] Dec 04 09:58:58 crc kubenswrapper[4776]: W1204 09:58:58.750612 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee5d641b_7a26_465b_90c8_ce70bb3ddcb9.slice/crio-aac3184854f83782feca7a6a10c73f7f096e70e8d24d520595b373edb02adbc1 WatchSource:0}: Error finding container aac3184854f83782feca7a6a10c73f7f096e70e8d24d520595b373edb02adbc1: Status 404 returned error can't find the container with id aac3184854f83782feca7a6a10c73f7f096e70e8d24d520595b373edb02adbc1 Dec 04 09:58:59 crc kubenswrapper[4776]: I1204 09:58:59.245290 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"b19391ae-29bb-4eef-a99b-c8746488c6f5","Type":"ContainerStarted","Data":"39e3d5e076fb7e82cf18ef48de5c4caf02e7e9166d6a1f26da4b45e5510618c4"} Dec 04 09:58:59 crc kubenswrapper[4776]: I1204 09:58:59.250760 4776 generic.go:334] "Generic (PLEG): container finished" podID="ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" containerID="f4f7150290d9d229bb908f236d53e21e3fbbeb63a65302dcbf1ee1f05aa5f2f2" exitCode=0 Dec 04 09:58:59 crc kubenswrapper[4776]: I1204 09:58:59.250983 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" event={"ID":"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9","Type":"ContainerDied","Data":"f4f7150290d9d229bb908f236d53e21e3fbbeb63a65302dcbf1ee1f05aa5f2f2"} Dec 04 09:58:59 crc kubenswrapper[4776]: I1204 09:58:59.251046 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" event={"ID":"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9","Type":"ContainerStarted","Data":"aac3184854f83782feca7a6a10c73f7f096e70e8d24d520595b373edb02adbc1"} Dec 04 09:58:59 crc kubenswrapper[4776]: I1204 09:58:59.255971 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-2wd87" event={"ID":"ec49e875-c217-4d3a-b821-a870a4ad1d24","Type":"ContainerStarted","Data":"429492833dc80a8971c5c65fc184829857826ebc009b1c22e3be611c179c3be2"} Dec 04 09:58:59 crc kubenswrapper[4776]: I1204 09:58:59.256050 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-2wd87" event={"ID":"ec49e875-c217-4d3a-b821-a870a4ad1d24","Type":"ContainerStarted","Data":"d92abb41b3687858603785ce28d4e8cfc5892438788fd2c16eb43fd19b19802b"} Dec 04 09:58:59 crc kubenswrapper[4776]: I1204 09:58:59.259494 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-chpcr" Dec 04 09:58:59 crc kubenswrapper[4776]: I1204 09:58:59.304943 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-2wd87" podStartSLOduration=2.304910952 podStartE2EDuration="2.304910952s" podCreationTimestamp="2025-12-04 09:58:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:58:59.302106254 +0000 UTC m=+1184.168586631" watchObservedRunningTime="2025-12-04 09:58:59.304910952 +0000 UTC m=+1184.171391329" Dec 04 09:58:59 crc kubenswrapper[4776]: I1204 09:58:59.405766 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-chpcr"] Dec 04 09:58:59 crc kubenswrapper[4776]: I1204 09:58:59.408642 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-chpcr"] Dec 04 09:58:59 crc kubenswrapper[4776]: I1204 09:58:59.467316 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b689a22-b93d-4fd7-80ac-6593ade4066a" path="/var/lib/kubelet/pods/0b689a22-b93d-4fd7-80ac-6593ade4066a/volumes" Dec 04 09:58:59 crc kubenswrapper[4776]: I1204 09:58:59.468417 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b394862-e729-482e-b741-d4b41a7fc5c1" path="/var/lib/kubelet/pods/8b394862-e729-482e-b741-d4b41a7fc5c1/volumes" Dec 04 09:59:00 crc kubenswrapper[4776]: I1204 09:59:00.269458 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" event={"ID":"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9","Type":"ContainerStarted","Data":"d7bc5307933b23c3faaaf69ca315ed8153afe1908a6978577a1085fadd6932be"} Dec 04 09:59:00 crc kubenswrapper[4776]: I1204 09:59:00.270745 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:59:00 crc 
kubenswrapper[4776]: I1204 09:59:00.294882 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" podStartSLOduration=3.294858644 podStartE2EDuration="3.294858644s" podCreationTimestamp="2025-12-04 09:58:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:59:00.287852425 +0000 UTC m=+1185.154332822" watchObservedRunningTime="2025-12-04 09:59:00.294858644 +0000 UTC m=+1185.161339021" Dec 04 09:59:02 crc kubenswrapper[4776]: E1204 09:59:02.148559 4776 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.173:54084->38.102.83.173:42595: write tcp 38.102.83.173:54084->38.102.83.173:42595: write: connection reset by peer Dec 04 09:59:02 crc kubenswrapper[4776]: I1204 09:59:02.285458 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b19391ae-29bb-4eef-a99b-c8746488c6f5","Type":"ContainerStarted","Data":"2660be926c5d3412ba7f2bf665cb7eb490f0b3b3aab5a1466c439dffd4504f12"} Dec 04 09:59:02 crc kubenswrapper[4776]: I1204 09:59:02.285516 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b19391ae-29bb-4eef-a99b-c8746488c6f5","Type":"ContainerStarted","Data":"d380a137d5744e44d362e71dad1663a9dee82f67f897a0323265da240659457c"} Dec 04 09:59:02 crc kubenswrapper[4776]: I1204 09:59:02.285638 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 04 09:59:02 crc kubenswrapper[4776]: I1204 09:59:02.308083 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.482875198 podStartE2EDuration="5.30806189s" podCreationTimestamp="2025-12-04 09:58:57 +0000 UTC" firstStartedPulling="2025-12-04 09:58:58.663355346 +0000 UTC m=+1183.529835723" lastFinishedPulling="2025-12-04 09:59:01.488542038 
+0000 UTC m=+1186.355022415" observedRunningTime="2025-12-04 09:59:02.303682654 +0000 UTC m=+1187.170163041" watchObservedRunningTime="2025-12-04 09:59:02.30806189 +0000 UTC m=+1187.174542267" Dec 04 09:59:02 crc kubenswrapper[4776]: I1204 09:59:02.993908 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 04 09:59:02 crc kubenswrapper[4776]: I1204 09:59:02.993978 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 04 09:59:03 crc kubenswrapper[4776]: I1204 09:59:03.862884 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 04 09:59:03 crc kubenswrapper[4776]: I1204 09:59:03.863229 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 04 09:59:04 crc kubenswrapper[4776]: I1204 09:59:04.558223 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 04 09:59:04 crc kubenswrapper[4776]: I1204 09:59:04.630241 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 04 09:59:05 crc kubenswrapper[4776]: I1204 09:59:05.198727 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 04 09:59:05 crc kubenswrapper[4776]: I1204 09:59:05.276468 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 04 09:59:08 crc kubenswrapper[4776]: I1204 09:59:08.020160 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 09:59:08 crc kubenswrapper[4776]: I1204 09:59:08.069821 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lhh6z"] Dec 04 09:59:08 crc kubenswrapper[4776]: I1204 09:59:08.070115 
4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" podUID="a246b792-5a73-4f08-9373-8a719864bd7d" containerName="dnsmasq-dns" containerID="cri-o://d9a11f11e01d11e4fee8eeec5c79a02cf5a9a4831bc302f93c070b38b8bd6ebb" gracePeriod=10 Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.062543 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.134058 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a246b792-5a73-4f08-9373-8a719864bd7d-config\") pod \"a246b792-5a73-4f08-9373-8a719864bd7d\" (UID: \"a246b792-5a73-4f08-9373-8a719864bd7d\") " Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.134975 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a246b792-5a73-4f08-9373-8a719864bd7d-dns-svc\") pod \"a246b792-5a73-4f08-9373-8a719864bd7d\" (UID: \"a246b792-5a73-4f08-9373-8a719864bd7d\") " Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.135151 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq2lg\" (UniqueName: \"kubernetes.io/projected/a246b792-5a73-4f08-9373-8a719864bd7d-kube-api-access-cq2lg\") pod \"a246b792-5a73-4f08-9373-8a719864bd7d\" (UID: \"a246b792-5a73-4f08-9373-8a719864bd7d\") " Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.144130 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a246b792-5a73-4f08-9373-8a719864bd7d-kube-api-access-cq2lg" (OuterVolumeSpecName: "kube-api-access-cq2lg") pod "a246b792-5a73-4f08-9373-8a719864bd7d" (UID: "a246b792-5a73-4f08-9373-8a719864bd7d"). InnerVolumeSpecName "kube-api-access-cq2lg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.236544 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a246b792-5a73-4f08-9373-8a719864bd7d-config" (OuterVolumeSpecName: "config") pod "a246b792-5a73-4f08-9373-8a719864bd7d" (UID: "a246b792-5a73-4f08-9373-8a719864bd7d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.236849 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a246b792-5a73-4f08-9373-8a719864bd7d-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.236874 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq2lg\" (UniqueName: \"kubernetes.io/projected/a246b792-5a73-4f08-9373-8a719864bd7d-kube-api-access-cq2lg\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.308544 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a246b792-5a73-4f08-9373-8a719864bd7d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a246b792-5a73-4f08-9373-8a719864bd7d" (UID: "a246b792-5a73-4f08-9373-8a719864bd7d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.337935 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a246b792-5a73-4f08-9373-8a719864bd7d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.347063 4776 generic.go:334] "Generic (PLEG): container finished" podID="a246b792-5a73-4f08-9373-8a719864bd7d" containerID="d9a11f11e01d11e4fee8eeec5c79a02cf5a9a4831bc302f93c070b38b8bd6ebb" exitCode=0 Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.347134 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" event={"ID":"a246b792-5a73-4f08-9373-8a719864bd7d","Type":"ContainerDied","Data":"d9a11f11e01d11e4fee8eeec5c79a02cf5a9a4831bc302f93c070b38b8bd6ebb"} Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.347139 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.347164 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lhh6z" event={"ID":"a246b792-5a73-4f08-9373-8a719864bd7d","Type":"ContainerDied","Data":"8f1a03e4760b4786145c8cdeaab339cef724db480e092e4ac44f6a53cb0a59a6"} Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.347181 4776 scope.go:117] "RemoveContainer" containerID="d9a11f11e01d11e4fee8eeec5c79a02cf5a9a4831bc302f93c070b38b8bd6ebb" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.353394 4776 generic.go:334] "Generic (PLEG): container finished" podID="0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" containerID="e6997a5028e360b37cee78c19260cfa8ab0bf6e191c0066aff520cf03484ca6a" exitCode=0 Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.353442 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c","Type":"ContainerDied","Data":"e6997a5028e360b37cee78c19260cfa8ab0bf6e191c0066aff520cf03484ca6a"} Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.380060 4776 scope.go:117] "RemoveContainer" containerID="9185a48334d81fe80d0dfb267f8483d10a8ca285b4cd7abebdb82d3c1beaa08e" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.544719 4776 scope.go:117] "RemoveContainer" containerID="d9a11f11e01d11e4fee8eeec5c79a02cf5a9a4831bc302f93c070b38b8bd6ebb" Dec 04 09:59:09 crc kubenswrapper[4776]: E1204 09:59:09.545340 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9a11f11e01d11e4fee8eeec5c79a02cf5a9a4831bc302f93c070b38b8bd6ebb\": container with ID starting with d9a11f11e01d11e4fee8eeec5c79a02cf5a9a4831bc302f93c070b38b8bd6ebb not found: ID does not exist" containerID="d9a11f11e01d11e4fee8eeec5c79a02cf5a9a4831bc302f93c070b38b8bd6ebb" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.545389 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9a11f11e01d11e4fee8eeec5c79a02cf5a9a4831bc302f93c070b38b8bd6ebb"} err="failed to get container status \"d9a11f11e01d11e4fee8eeec5c79a02cf5a9a4831bc302f93c070b38b8bd6ebb\": rpc error: code = NotFound desc = could not find container \"d9a11f11e01d11e4fee8eeec5c79a02cf5a9a4831bc302f93c070b38b8bd6ebb\": container with ID starting with d9a11f11e01d11e4fee8eeec5c79a02cf5a9a4831bc302f93c070b38b8bd6ebb not found: ID does not exist" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.545414 4776 scope.go:117] "RemoveContainer" containerID="9185a48334d81fe80d0dfb267f8483d10a8ca285b4cd7abebdb82d3c1beaa08e" Dec 04 09:59:09 crc kubenswrapper[4776]: E1204 09:59:09.547474 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9185a48334d81fe80d0dfb267f8483d10a8ca285b4cd7abebdb82d3c1beaa08e\": container with ID starting with 9185a48334d81fe80d0dfb267f8483d10a8ca285b4cd7abebdb82d3c1beaa08e not found: ID does not exist" containerID="9185a48334d81fe80d0dfb267f8483d10a8ca285b4cd7abebdb82d3c1beaa08e" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.547514 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9185a48334d81fe80d0dfb267f8483d10a8ca285b4cd7abebdb82d3c1beaa08e"} err="failed to get container status \"9185a48334d81fe80d0dfb267f8483d10a8ca285b4cd7abebdb82d3c1beaa08e\": rpc error: code = NotFound desc = could not find container \"9185a48334d81fe80d0dfb267f8483d10a8ca285b4cd7abebdb82d3c1beaa08e\": container with ID starting with 9185a48334d81fe80d0dfb267f8483d10a8ca285b4cd7abebdb82d3c1beaa08e not found: ID does not exist" Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.562807 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lhh6z"] Dec 04 09:59:09 crc kubenswrapper[4776]: I1204 09:59:09.571106 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lhh6z"] Dec 04 09:59:10 crc kubenswrapper[4776]: I1204 09:59:10.370011 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c","Type":"ContainerStarted","Data":"ac93e3b00fe9aa384a31c981b7cd010db59a5e4a593c0ae1cfe67d3f462327ca"} Dec 04 09:59:10 crc kubenswrapper[4776]: I1204 09:59:10.370496 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:59:10 crc kubenswrapper[4776]: I1204 09:59:10.401195 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.487937826 podStartE2EDuration="1m1.401178998s" podCreationTimestamp="2025-12-04 09:58:09 +0000 UTC" 
firstStartedPulling="2025-12-04 09:58:11.944720966 +0000 UTC m=+1136.811201383" lastFinishedPulling="2025-12-04 09:58:34.857962178 +0000 UTC m=+1159.724442555" observedRunningTime="2025-12-04 09:59:10.398657639 +0000 UTC m=+1195.265138036" watchObservedRunningTime="2025-12-04 09:59:10.401178998 +0000 UTC m=+1195.267659375" Dec 04 09:59:11 crc kubenswrapper[4776]: I1204 09:59:11.462669 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a246b792-5a73-4f08-9373-8a719864bd7d" path="/var/lib/kubelet/pods/a246b792-5a73-4f08-9373-8a719864bd7d/volumes" Dec 04 09:59:12 crc kubenswrapper[4776]: I1204 09:59:12.974218 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.720992 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-e236-account-create-update-n8sb9"] Dec 04 09:59:13 crc kubenswrapper[4776]: E1204 09:59:13.721437 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a246b792-5a73-4f08-9373-8a719864bd7d" containerName="dnsmasq-dns" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.721457 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a246b792-5a73-4f08-9373-8a719864bd7d" containerName="dnsmasq-dns" Dec 04 09:59:13 crc kubenswrapper[4776]: E1204 09:59:13.721485 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a246b792-5a73-4f08-9373-8a719864bd7d" containerName="init" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.721492 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a246b792-5a73-4f08-9373-8a719864bd7d" containerName="init" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.721693 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a246b792-5a73-4f08-9373-8a719864bd7d" containerName="dnsmasq-dns" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.722409 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-e236-account-create-update-n8sb9" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.724798 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.728700 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-z6jmg"] Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.729765 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-z6jmg" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.741344 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-z6jmg"] Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.754983 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e236-account-create-update-n8sb9"] Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.814879 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0796537-9ea0-42b5-9701-04487a4ca241-operator-scripts\") pod \"keystone-e236-account-create-update-n8sb9\" (UID: \"f0796537-9ea0-42b5-9701-04487a4ca241\") " pod="openstack/keystone-e236-account-create-update-n8sb9" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.815056 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xngcn\" (UniqueName: \"kubernetes.io/projected/f0796537-9ea0-42b5-9701-04487a4ca241-kube-api-access-xngcn\") pod \"keystone-e236-account-create-update-n8sb9\" (UID: \"f0796537-9ea0-42b5-9701-04487a4ca241\") " pod="openstack/keystone-e236-account-create-update-n8sb9" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.815237 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a-operator-scripts\") pod \"keystone-db-create-z6jmg\" (UID: \"9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a\") " pod="openstack/keystone-db-create-z6jmg" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.815274 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvknw\" (UniqueName: \"kubernetes.io/projected/9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a-kube-api-access-qvknw\") pod \"keystone-db-create-z6jmg\" (UID: \"9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a\") " pod="openstack/keystone-db-create-z6jmg" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.890183 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-jdlgz"] Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.891527 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jdlgz" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.907968 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-e29f-account-create-update-qpv5v"] Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.909228 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e29f-account-create-update-qpv5v" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.911738 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.917544 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0796537-9ea0-42b5-9701-04487a4ca241-operator-scripts\") pod \"keystone-e236-account-create-update-n8sb9\" (UID: \"f0796537-9ea0-42b5-9701-04487a4ca241\") " pod="openstack/keystone-e236-account-create-update-n8sb9" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.917598 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xngcn\" (UniqueName: \"kubernetes.io/projected/f0796537-9ea0-42b5-9701-04487a4ca241-kube-api-access-xngcn\") pod \"keystone-e236-account-create-update-n8sb9\" (UID: \"f0796537-9ea0-42b5-9701-04487a4ca241\") " pod="openstack/keystone-e236-account-create-update-n8sb9" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.917633 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1551b7-3a64-4400-b3b6-8b3e1334401e-operator-scripts\") pod \"placement-db-create-jdlgz\" (UID: \"fb1551b7-3a64-4400-b3b6-8b3e1334401e\") " pod="openstack/placement-db-create-jdlgz" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.917679 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7btfg\" (UniqueName: \"kubernetes.io/projected/fb1551b7-3a64-4400-b3b6-8b3e1334401e-kube-api-access-7btfg\") pod \"placement-db-create-jdlgz\" (UID: \"fb1551b7-3a64-4400-b3b6-8b3e1334401e\") " pod="openstack/placement-db-create-jdlgz" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.917701 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a-operator-scripts\") pod \"keystone-db-create-z6jmg\" (UID: \"9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a\") " pod="openstack/keystone-db-create-z6jmg" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.917722 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvknw\" (UniqueName: \"kubernetes.io/projected/9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a-kube-api-access-qvknw\") pod \"keystone-db-create-z6jmg\" (UID: \"9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a\") " pod="openstack/keystone-db-create-z6jmg" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.918656 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0796537-9ea0-42b5-9701-04487a4ca241-operator-scripts\") pod \"keystone-e236-account-create-update-n8sb9\" (UID: \"f0796537-9ea0-42b5-9701-04487a4ca241\") " pod="openstack/keystone-e236-account-create-update-n8sb9" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.919365 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a-operator-scripts\") pod \"keystone-db-create-z6jmg\" (UID: \"9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a\") " pod="openstack/keystone-db-create-z6jmg" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.940880 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jdlgz"] Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.959040 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xngcn\" (UniqueName: \"kubernetes.io/projected/f0796537-9ea0-42b5-9701-04487a4ca241-kube-api-access-xngcn\") pod \"keystone-e236-account-create-update-n8sb9\" (UID: 
\"f0796537-9ea0-42b5-9701-04487a4ca241\") " pod="openstack/keystone-e236-account-create-update-n8sb9" Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.967307 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e29f-account-create-update-qpv5v"] Dec 04 09:59:13 crc kubenswrapper[4776]: I1204 09:59:13.969770 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvknw\" (UniqueName: \"kubernetes.io/projected/9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a-kube-api-access-qvknw\") pod \"keystone-db-create-z6jmg\" (UID: \"9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a\") " pod="openstack/keystone-db-create-z6jmg" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.018797 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/468626e9-c715-4f2d-bb1e-35f3ac706a17-operator-scripts\") pod \"placement-e29f-account-create-update-qpv5v\" (UID: \"468626e9-c715-4f2d-bb1e-35f3ac706a17\") " pod="openstack/placement-e29f-account-create-update-qpv5v" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.018934 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1551b7-3a64-4400-b3b6-8b3e1334401e-operator-scripts\") pod \"placement-db-create-jdlgz\" (UID: \"fb1551b7-3a64-4400-b3b6-8b3e1334401e\") " pod="openstack/placement-db-create-jdlgz" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.018987 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7btfg\" (UniqueName: \"kubernetes.io/projected/fb1551b7-3a64-4400-b3b6-8b3e1334401e-kube-api-access-7btfg\") pod \"placement-db-create-jdlgz\" (UID: \"fb1551b7-3a64-4400-b3b6-8b3e1334401e\") " pod="openstack/placement-db-create-jdlgz" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.019039 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8t6w\" (UniqueName: \"kubernetes.io/projected/468626e9-c715-4f2d-bb1e-35f3ac706a17-kube-api-access-k8t6w\") pod \"placement-e29f-account-create-update-qpv5v\" (UID: \"468626e9-c715-4f2d-bb1e-35f3ac706a17\") " pod="openstack/placement-e29f-account-create-update-qpv5v" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.019888 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1551b7-3a64-4400-b3b6-8b3e1334401e-operator-scripts\") pod \"placement-db-create-jdlgz\" (UID: \"fb1551b7-3a64-4400-b3b6-8b3e1334401e\") " pod="openstack/placement-db-create-jdlgz" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.036380 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7btfg\" (UniqueName: \"kubernetes.io/projected/fb1551b7-3a64-4400-b3b6-8b3e1334401e-kube-api-access-7btfg\") pod \"placement-db-create-jdlgz\" (UID: \"fb1551b7-3a64-4400-b3b6-8b3e1334401e\") " pod="openstack/placement-db-create-jdlgz" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.047736 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e236-account-create-update-n8sb9" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.063355 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-z6jmg" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.120149 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8t6w\" (UniqueName: \"kubernetes.io/projected/468626e9-c715-4f2d-bb1e-35f3ac706a17-kube-api-access-k8t6w\") pod \"placement-e29f-account-create-update-qpv5v\" (UID: \"468626e9-c715-4f2d-bb1e-35f3ac706a17\") " pod="openstack/placement-e29f-account-create-update-qpv5v" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.120244 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/468626e9-c715-4f2d-bb1e-35f3ac706a17-operator-scripts\") pod \"placement-e29f-account-create-update-qpv5v\" (UID: \"468626e9-c715-4f2d-bb1e-35f3ac706a17\") " pod="openstack/placement-e29f-account-create-update-qpv5v" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.120978 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/468626e9-c715-4f2d-bb1e-35f3ac706a17-operator-scripts\") pod \"placement-e29f-account-create-update-qpv5v\" (UID: \"468626e9-c715-4f2d-bb1e-35f3ac706a17\") " pod="openstack/placement-e29f-account-create-update-qpv5v" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.140903 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8t6w\" (UniqueName: \"kubernetes.io/projected/468626e9-c715-4f2d-bb1e-35f3ac706a17-kube-api-access-k8t6w\") pod \"placement-e29f-account-create-update-qpv5v\" (UID: \"468626e9-c715-4f2d-bb1e-35f3ac706a17\") " pod="openstack/placement-e29f-account-create-update-qpv5v" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.184249 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-gh7dv"] Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.185341 4776 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gh7dv" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.192674 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gh7dv"] Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.221636 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jdlgz" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.226621 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c203a97c-dc5b-4a58-bb5c-f826221c87f3-operator-scripts\") pod \"glance-db-create-gh7dv\" (UID: \"c203a97c-dc5b-4a58-bb5c-f826221c87f3\") " pod="openstack/glance-db-create-gh7dv" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.226787 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbgb5\" (UniqueName: \"kubernetes.io/projected/c203a97c-dc5b-4a58-bb5c-f826221c87f3-kube-api-access-wbgb5\") pod \"glance-db-create-gh7dv\" (UID: \"c203a97c-dc5b-4a58-bb5c-f826221c87f3\") " pod="openstack/glance-db-create-gh7dv" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.255830 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e29f-account-create-update-qpv5v" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.294666 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e717-account-create-update-qc9bj"] Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.296222 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e717-account-create-update-qc9bj" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.318753 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.327904 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad3ed01-a668-4335-9254-46a2c1704e90-operator-scripts\") pod \"glance-e717-account-create-update-qc9bj\" (UID: \"bad3ed01-a668-4335-9254-46a2c1704e90\") " pod="openstack/glance-e717-account-create-update-qc9bj" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.328032 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbgb5\" (UniqueName: \"kubernetes.io/projected/c203a97c-dc5b-4a58-bb5c-f826221c87f3-kube-api-access-wbgb5\") pod \"glance-db-create-gh7dv\" (UID: \"c203a97c-dc5b-4a58-bb5c-f826221c87f3\") " pod="openstack/glance-db-create-gh7dv" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.328137 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gccm\" (UniqueName: \"kubernetes.io/projected/bad3ed01-a668-4335-9254-46a2c1704e90-kube-api-access-9gccm\") pod \"glance-e717-account-create-update-qc9bj\" (UID: \"bad3ed01-a668-4335-9254-46a2c1704e90\") " pod="openstack/glance-e717-account-create-update-qc9bj" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.328207 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c203a97c-dc5b-4a58-bb5c-f826221c87f3-operator-scripts\") pod \"glance-db-create-gh7dv\" (UID: \"c203a97c-dc5b-4a58-bb5c-f826221c87f3\") " pod="openstack/glance-db-create-gh7dv" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.329087 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c203a97c-dc5b-4a58-bb5c-f826221c87f3-operator-scripts\") pod \"glance-db-create-gh7dv\" (UID: \"c203a97c-dc5b-4a58-bb5c-f826221c87f3\") " pod="openstack/glance-db-create-gh7dv" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.335149 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e717-account-create-update-qc9bj"] Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.349862 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbgb5\" (UniqueName: \"kubernetes.io/projected/c203a97c-dc5b-4a58-bb5c-f826221c87f3-kube-api-access-wbgb5\") pod \"glance-db-create-gh7dv\" (UID: \"c203a97c-dc5b-4a58-bb5c-f826221c87f3\") " pod="openstack/glance-db-create-gh7dv" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.431089 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gccm\" (UniqueName: \"kubernetes.io/projected/bad3ed01-a668-4335-9254-46a2c1704e90-kube-api-access-9gccm\") pod \"glance-e717-account-create-update-qc9bj\" (UID: \"bad3ed01-a668-4335-9254-46a2c1704e90\") " pod="openstack/glance-e717-account-create-update-qc9bj" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.431215 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad3ed01-a668-4335-9254-46a2c1704e90-operator-scripts\") pod \"glance-e717-account-create-update-qc9bj\" (UID: \"bad3ed01-a668-4335-9254-46a2c1704e90\") " pod="openstack/glance-e717-account-create-update-qc9bj" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.432211 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad3ed01-a668-4335-9254-46a2c1704e90-operator-scripts\") pod \"glance-e717-account-create-update-qc9bj\" (UID: 
\"bad3ed01-a668-4335-9254-46a2c1704e90\") " pod="openstack/glance-e717-account-create-update-qc9bj" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.451158 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gccm\" (UniqueName: \"kubernetes.io/projected/bad3ed01-a668-4335-9254-46a2c1704e90-kube-api-access-9gccm\") pod \"glance-e717-account-create-update-qc9bj\" (UID: \"bad3ed01-a668-4335-9254-46a2c1704e90\") " pod="openstack/glance-e717-account-create-update-qc9bj" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.505292 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gh7dv" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.656554 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e717-account-create-update-qc9bj" Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.754986 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e236-account-create-update-n8sb9"] Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.786723 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-z6jmg"] Dec 04 09:59:14 crc kubenswrapper[4776]: W1204 09:59:14.805885 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ae68b78_c26b_4177_a1bf_7b5cef7a4f4a.slice/crio-999474c66069f784e35d9722d8967a6c503f85eb5dad4287bc5b53f336740fcb WatchSource:0}: Error finding container 999474c66069f784e35d9722d8967a6c503f85eb5dad4287bc5b53f336740fcb: Status 404 returned error can't find the container with id 999474c66069f784e35d9722d8967a6c503f85eb5dad4287bc5b53f336740fcb Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.864602 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jdlgz"] Dec 04 09:59:14 crc kubenswrapper[4776]: I1204 09:59:14.877337 4776 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e29f-account-create-update-qpv5v"] Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.020367 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gh7dv"] Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.188797 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e717-account-create-update-qc9bj"] Dec 04 09:59:15 crc kubenswrapper[4776]: W1204 09:59:15.191177 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbad3ed01_a668_4335_9254_46a2c1704e90.slice/crio-60c310a6ae8518651c1604e34d0023e7c93ecead6d4a1face2b66e07f4d1f0d5 WatchSource:0}: Error finding container 60c310a6ae8518651c1604e34d0023e7c93ecead6d4a1face2b66e07f4d1f0d5: Status 404 returned error can't find the container with id 60c310a6ae8518651c1604e34d0023e7c93ecead6d4a1face2b66e07f4d1f0d5 Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.423304 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e29f-account-create-update-qpv5v" event={"ID":"468626e9-c715-4f2d-bb1e-35f3ac706a17","Type":"ContainerStarted","Data":"360b51b5c7bc3bcd47ecd894ebf652c2aed70e580fffa8bbc24264fd49b7bbe4"} Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.423667 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e29f-account-create-update-qpv5v" event={"ID":"468626e9-c715-4f2d-bb1e-35f3ac706a17","Type":"ContainerStarted","Data":"826e29d1abdf0aefb0d68089cd5e9cd584115e90fb62541a7d560249a4148d8c"} Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.426298 4776 generic.go:334] "Generic (PLEG): container finished" podID="9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a" containerID="7e49031e0c831ace68a19564a059c357bd6627a28dfe316983b9b924af419fb4" exitCode=0 Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.426480 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-db-create-z6jmg" event={"ID":"9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a","Type":"ContainerDied","Data":"7e49031e0c831ace68a19564a059c357bd6627a28dfe316983b9b924af419fb4"} Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.426584 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z6jmg" event={"ID":"9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a","Type":"ContainerStarted","Data":"999474c66069f784e35d9722d8967a6c503f85eb5dad4287bc5b53f336740fcb"} Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.428076 4776 generic.go:334] "Generic (PLEG): container finished" podID="f0796537-9ea0-42b5-9701-04487a4ca241" containerID="d2bac99ed10b7b300952dfcd9a61aa4b777eb14de2a46a92993167bcfdd7d941" exitCode=0 Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.428232 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e236-account-create-update-n8sb9" event={"ID":"f0796537-9ea0-42b5-9701-04487a4ca241","Type":"ContainerDied","Data":"d2bac99ed10b7b300952dfcd9a61aa4b777eb14de2a46a92993167bcfdd7d941"} Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.428348 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e236-account-create-update-n8sb9" event={"ID":"f0796537-9ea0-42b5-9701-04487a4ca241","Type":"ContainerStarted","Data":"66fdcda8c3f81765e517cce096b20fd1fb0021bdda2eead69e5d42c2f3e54386"} Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.429699 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e717-account-create-update-qc9bj" event={"ID":"bad3ed01-a668-4335-9254-46a2c1704e90","Type":"ContainerStarted","Data":"60c310a6ae8518651c1604e34d0023e7c93ecead6d4a1face2b66e07f4d1f0d5"} Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.431237 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jdlgz" 
event={"ID":"fb1551b7-3a64-4400-b3b6-8b3e1334401e","Type":"ContainerStarted","Data":"24ca719a906051ec2d37965d52f25285963d82481e5aec13b480bd9e3d13d3f7"} Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.432556 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gh7dv" event={"ID":"c203a97c-dc5b-4a58-bb5c-f826221c87f3","Type":"ContainerStarted","Data":"7782d938fd9ce3281ebd5288d7c13c5c897299bff6c0a0891db5125db2b4d794"} Dec 04 09:59:15 crc kubenswrapper[4776]: I1204 09:59:15.457007 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-e29f-account-create-update-qpv5v" podStartSLOduration=2.4569893990000002 podStartE2EDuration="2.456989399s" podCreationTimestamp="2025-12-04 09:59:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:59:15.456588526 +0000 UTC m=+1200.323068903" watchObservedRunningTime="2025-12-04 09:59:15.456989399 +0000 UTC m=+1200.323469776" Dec 04 09:59:16 crc kubenswrapper[4776]: I1204 09:59:16.441946 4776 generic.go:334] "Generic (PLEG): container finished" podID="468626e9-c715-4f2d-bb1e-35f3ac706a17" containerID="360b51b5c7bc3bcd47ecd894ebf652c2aed70e580fffa8bbc24264fd49b7bbe4" exitCode=0 Dec 04 09:59:16 crc kubenswrapper[4776]: I1204 09:59:16.442040 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e29f-account-create-update-qpv5v" event={"ID":"468626e9-c715-4f2d-bb1e-35f3ac706a17","Type":"ContainerDied","Data":"360b51b5c7bc3bcd47ecd894ebf652c2aed70e580fffa8bbc24264fd49b7bbe4"} Dec 04 09:59:16 crc kubenswrapper[4776]: I1204 09:59:16.446289 4776 generic.go:334] "Generic (PLEG): container finished" podID="bad3ed01-a668-4335-9254-46a2c1704e90" containerID="503a98ebf65c564782a93d3715e1c72685e279fcdc9276605a89ae86165428e1" exitCode=0 Dec 04 09:59:16 crc kubenswrapper[4776]: I1204 09:59:16.446439 4776 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-e717-account-create-update-qc9bj" event={"ID":"bad3ed01-a668-4335-9254-46a2c1704e90","Type":"ContainerDied","Data":"503a98ebf65c564782a93d3715e1c72685e279fcdc9276605a89ae86165428e1"} Dec 04 09:59:16 crc kubenswrapper[4776]: I1204 09:59:16.448879 4776 generic.go:334] "Generic (PLEG): container finished" podID="fb1551b7-3a64-4400-b3b6-8b3e1334401e" containerID="a63949576df1f8f195a4298a3b52d20e47d84f4feddd41f7960c58ec54a73564" exitCode=0 Dec 04 09:59:16 crc kubenswrapper[4776]: I1204 09:59:16.449028 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jdlgz" event={"ID":"fb1551b7-3a64-4400-b3b6-8b3e1334401e","Type":"ContainerDied","Data":"a63949576df1f8f195a4298a3b52d20e47d84f4feddd41f7960c58ec54a73564"} Dec 04 09:59:16 crc kubenswrapper[4776]: I1204 09:59:16.451175 4776 generic.go:334] "Generic (PLEG): container finished" podID="c203a97c-dc5b-4a58-bb5c-f826221c87f3" containerID="aead5ee0ef4ce7755ac198ac9300d19e56c02ec824ec6ecf9ed70caebc744169" exitCode=0 Dec 04 09:59:16 crc kubenswrapper[4776]: I1204 09:59:16.451398 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gh7dv" event={"ID":"c203a97c-dc5b-4a58-bb5c-f826221c87f3","Type":"ContainerDied","Data":"aead5ee0ef4ce7755ac198ac9300d19e56c02ec824ec6ecf9ed70caebc744169"} Dec 04 09:59:16 crc kubenswrapper[4776]: I1204 09:59:16.896619 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e236-account-create-update-n8sb9" Dec 04 09:59:16 crc kubenswrapper[4776]: I1204 09:59:16.904015 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-z6jmg" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.097489 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0796537-9ea0-42b5-9701-04487a4ca241-operator-scripts\") pod \"f0796537-9ea0-42b5-9701-04487a4ca241\" (UID: \"f0796537-9ea0-42b5-9701-04487a4ca241\") " Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.097832 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a-operator-scripts\") pod \"9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a\" (UID: \"9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a\") " Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.098018 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xngcn\" (UniqueName: \"kubernetes.io/projected/f0796537-9ea0-42b5-9701-04487a4ca241-kube-api-access-xngcn\") pod \"f0796537-9ea0-42b5-9701-04487a4ca241\" (UID: \"f0796537-9ea0-42b5-9701-04487a4ca241\") " Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.098122 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvknw\" (UniqueName: \"kubernetes.io/projected/9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a-kube-api-access-qvknw\") pod \"9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a\" (UID: \"9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a\") " Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.098612 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0796537-9ea0-42b5-9701-04487a4ca241-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0796537-9ea0-42b5-9701-04487a4ca241" (UID: "f0796537-9ea0-42b5-9701-04487a4ca241"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.099076 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a" (UID: "9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.104826 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a-kube-api-access-qvknw" (OuterVolumeSpecName: "kube-api-access-qvknw") pod "9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a" (UID: "9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a"). InnerVolumeSpecName "kube-api-access-qvknw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.121720 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0796537-9ea0-42b5-9701-04487a4ca241-kube-api-access-xngcn" (OuterVolumeSpecName: "kube-api-access-xngcn") pod "f0796537-9ea0-42b5-9701-04487a4ca241" (UID: "f0796537-9ea0-42b5-9701-04487a4ca241"). InnerVolumeSpecName "kube-api-access-xngcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.199831 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvknw\" (UniqueName: \"kubernetes.io/projected/9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a-kube-api-access-qvknw\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.199871 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xngcn\" (UniqueName: \"kubernetes.io/projected/f0796537-9ea0-42b5-9701-04487a4ca241-kube-api-access-xngcn\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.199882 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0796537-9ea0-42b5-9701-04487a4ca241-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.199893 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.461099 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-z6jmg" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.468308 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z6jmg" event={"ID":"9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a","Type":"ContainerDied","Data":"999474c66069f784e35d9722d8967a6c503f85eb5dad4287bc5b53f336740fcb"} Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.468610 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="999474c66069f784e35d9722d8967a6c503f85eb5dad4287bc5b53f336740fcb" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.468721 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e236-account-create-update-n8sb9" event={"ID":"f0796537-9ea0-42b5-9701-04487a4ca241","Type":"ContainerDied","Data":"66fdcda8c3f81765e517cce096b20fd1fb0021bdda2eead69e5d42c2f3e54386"} Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.468828 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66fdcda8c3f81765e517cce096b20fd1fb0021bdda2eead69e5d42c2f3e54386" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.468956 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e236-account-create-update-n8sb9" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.755355 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e717-account-create-update-qc9bj" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.889211 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jdlgz" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.910102 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad3ed01-a668-4335-9254-46a2c1704e90-operator-scripts\") pod \"bad3ed01-a668-4335-9254-46a2c1704e90\" (UID: \"bad3ed01-a668-4335-9254-46a2c1704e90\") " Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.910151 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gccm\" (UniqueName: \"kubernetes.io/projected/bad3ed01-a668-4335-9254-46a2c1704e90-kube-api-access-9gccm\") pod \"bad3ed01-a668-4335-9254-46a2c1704e90\" (UID: \"bad3ed01-a668-4335-9254-46a2c1704e90\") " Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.910195 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7btfg\" (UniqueName: \"kubernetes.io/projected/fb1551b7-3a64-4400-b3b6-8b3e1334401e-kube-api-access-7btfg\") pod \"fb1551b7-3a64-4400-b3b6-8b3e1334401e\" (UID: \"fb1551b7-3a64-4400-b3b6-8b3e1334401e\") " Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.911426 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad3ed01-a668-4335-9254-46a2c1704e90-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bad3ed01-a668-4335-9254-46a2c1704e90" (UID: "bad3ed01-a668-4335-9254-46a2c1704e90"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.921179 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1551b7-3a64-4400-b3b6-8b3e1334401e-kube-api-access-7btfg" (OuterVolumeSpecName: "kube-api-access-7btfg") pod "fb1551b7-3a64-4400-b3b6-8b3e1334401e" (UID: "fb1551b7-3a64-4400-b3b6-8b3e1334401e"). 
InnerVolumeSpecName "kube-api-access-7btfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.925211 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gh7dv" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.925391 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad3ed01-a668-4335-9254-46a2c1704e90-kube-api-access-9gccm" (OuterVolumeSpecName: "kube-api-access-9gccm") pod "bad3ed01-a668-4335-9254-46a2c1704e90" (UID: "bad3ed01-a668-4335-9254-46a2c1704e90"). InnerVolumeSpecName "kube-api-access-9gccm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:17 crc kubenswrapper[4776]: I1204 09:59:17.937411 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e29f-account-create-update-qpv5v" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.011947 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1551b7-3a64-4400-b3b6-8b3e1334401e-operator-scripts\") pod \"fb1551b7-3a64-4400-b3b6-8b3e1334401e\" (UID: \"fb1551b7-3a64-4400-b3b6-8b3e1334401e\") " Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.012350 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7btfg\" (UniqueName: \"kubernetes.io/projected/fb1551b7-3a64-4400-b3b6-8b3e1334401e-kube-api-access-7btfg\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.012369 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad3ed01-a668-4335-9254-46a2c1704e90-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.012378 4776 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-9gccm\" (UniqueName: \"kubernetes.io/projected/bad3ed01-a668-4335-9254-46a2c1704e90-kube-api-access-9gccm\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.012468 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1551b7-3a64-4400-b3b6-8b3e1334401e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb1551b7-3a64-4400-b3b6-8b3e1334401e" (UID: "fb1551b7-3a64-4400-b3b6-8b3e1334401e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.113244 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/468626e9-c715-4f2d-bb1e-35f3ac706a17-operator-scripts\") pod \"468626e9-c715-4f2d-bb1e-35f3ac706a17\" (UID: \"468626e9-c715-4f2d-bb1e-35f3ac706a17\") " Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.113317 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8t6w\" (UniqueName: \"kubernetes.io/projected/468626e9-c715-4f2d-bb1e-35f3ac706a17-kube-api-access-k8t6w\") pod \"468626e9-c715-4f2d-bb1e-35f3ac706a17\" (UID: \"468626e9-c715-4f2d-bb1e-35f3ac706a17\") " Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.113413 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbgb5\" (UniqueName: \"kubernetes.io/projected/c203a97c-dc5b-4a58-bb5c-f826221c87f3-kube-api-access-wbgb5\") pod \"c203a97c-dc5b-4a58-bb5c-f826221c87f3\" (UID: \"c203a97c-dc5b-4a58-bb5c-f826221c87f3\") " Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.113440 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c203a97c-dc5b-4a58-bb5c-f826221c87f3-operator-scripts\") pod 
\"c203a97c-dc5b-4a58-bb5c-f826221c87f3\" (UID: \"c203a97c-dc5b-4a58-bb5c-f826221c87f3\") " Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.113865 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1551b7-3a64-4400-b3b6-8b3e1334401e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.114100 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c203a97c-dc5b-4a58-bb5c-f826221c87f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c203a97c-dc5b-4a58-bb5c-f826221c87f3" (UID: "c203a97c-dc5b-4a58-bb5c-f826221c87f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.114570 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/468626e9-c715-4f2d-bb1e-35f3ac706a17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "468626e9-c715-4f2d-bb1e-35f3ac706a17" (UID: "468626e9-c715-4f2d-bb1e-35f3ac706a17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.117599 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c203a97c-dc5b-4a58-bb5c-f826221c87f3-kube-api-access-wbgb5" (OuterVolumeSpecName: "kube-api-access-wbgb5") pod "c203a97c-dc5b-4a58-bb5c-f826221c87f3" (UID: "c203a97c-dc5b-4a58-bb5c-f826221c87f3"). InnerVolumeSpecName "kube-api-access-wbgb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.122058 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/468626e9-c715-4f2d-bb1e-35f3ac706a17-kube-api-access-k8t6w" (OuterVolumeSpecName: "kube-api-access-k8t6w") pod "468626e9-c715-4f2d-bb1e-35f3ac706a17" (UID: "468626e9-c715-4f2d-bb1e-35f3ac706a17"). InnerVolumeSpecName "kube-api-access-k8t6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.215163 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbgb5\" (UniqueName: \"kubernetes.io/projected/c203a97c-dc5b-4a58-bb5c-f826221c87f3-kube-api-access-wbgb5\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.215198 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c203a97c-dc5b-4a58-bb5c-f826221c87f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.215209 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/468626e9-c715-4f2d-bb1e-35f3ac706a17-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.215220 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8t6w\" (UniqueName: \"kubernetes.io/projected/468626e9-c715-4f2d-bb1e-35f3ac706a17-kube-api-access-k8t6w\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.482878 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e29f-account-create-update-qpv5v" event={"ID":"468626e9-c715-4f2d-bb1e-35f3ac706a17","Type":"ContainerDied","Data":"826e29d1abdf0aefb0d68089cd5e9cd584115e90fb62541a7d560249a4148d8c"} Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 
09:59:18.482932 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="826e29d1abdf0aefb0d68089cd5e9cd584115e90fb62541a7d560249a4148d8c" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.483013 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e29f-account-create-update-qpv5v" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.489757 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e717-account-create-update-qc9bj" event={"ID":"bad3ed01-a668-4335-9254-46a2c1704e90","Type":"ContainerDied","Data":"60c310a6ae8518651c1604e34d0023e7c93ecead6d4a1face2b66e07f4d1f0d5"} Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.489811 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60c310a6ae8518651c1604e34d0023e7c93ecead6d4a1face2b66e07f4d1f0d5" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.489931 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e717-account-create-update-qc9bj" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.494737 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jdlgz" event={"ID":"fb1551b7-3a64-4400-b3b6-8b3e1334401e","Type":"ContainerDied","Data":"24ca719a906051ec2d37965d52f25285963d82481e5aec13b480bd9e3d13d3f7"} Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.494784 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24ca719a906051ec2d37965d52f25285963d82481e5aec13b480bd9e3d13d3f7" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.494816 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jdlgz" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.499017 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gh7dv" event={"ID":"c203a97c-dc5b-4a58-bb5c-f826221c87f3","Type":"ContainerDied","Data":"7782d938fd9ce3281ebd5288d7c13c5c897299bff6c0a0891db5125db2b4d794"} Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.499052 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7782d938fd9ce3281ebd5288d7c13c5c897299bff6c0a0891db5125db2b4d794" Dec 04 09:59:18 crc kubenswrapper[4776]: I1204 09:59:18.499131 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gh7dv" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.434793 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-6vlkr"] Dec 04 09:59:19 crc kubenswrapper[4776]: E1204 09:59:19.436121 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c203a97c-dc5b-4a58-bb5c-f826221c87f3" containerName="mariadb-database-create" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.436143 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c203a97c-dc5b-4a58-bb5c-f826221c87f3" containerName="mariadb-database-create" Dec 04 09:59:19 crc kubenswrapper[4776]: E1204 09:59:19.436158 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad3ed01-a668-4335-9254-46a2c1704e90" containerName="mariadb-account-create-update" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.436164 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad3ed01-a668-4335-9254-46a2c1704e90" containerName="mariadb-account-create-update" Dec 04 09:59:19 crc kubenswrapper[4776]: E1204 09:59:19.436179 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468626e9-c715-4f2d-bb1e-35f3ac706a17" containerName="mariadb-account-create-update" Dec 04 
09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.436185 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="468626e9-c715-4f2d-bb1e-35f3ac706a17" containerName="mariadb-account-create-update" Dec 04 09:59:19 crc kubenswrapper[4776]: E1204 09:59:19.436193 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a" containerName="mariadb-database-create" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.436199 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a" containerName="mariadb-database-create" Dec 04 09:59:19 crc kubenswrapper[4776]: E1204 09:59:19.436221 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1551b7-3a64-4400-b3b6-8b3e1334401e" containerName="mariadb-database-create" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.436227 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1551b7-3a64-4400-b3b6-8b3e1334401e" containerName="mariadb-database-create" Dec 04 09:59:19 crc kubenswrapper[4776]: E1204 09:59:19.436241 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0796537-9ea0-42b5-9701-04487a4ca241" containerName="mariadb-account-create-update" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.436246 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0796537-9ea0-42b5-9701-04487a4ca241" containerName="mariadb-account-create-update" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.436400 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0796537-9ea0-42b5-9701-04487a4ca241" containerName="mariadb-account-create-update" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.436415 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a" containerName="mariadb-database-create" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.436425 4776 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="468626e9-c715-4f2d-bb1e-35f3ac706a17" containerName="mariadb-account-create-update" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.436431 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c203a97c-dc5b-4a58-bb5c-f826221c87f3" containerName="mariadb-database-create" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.436441 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1551b7-3a64-4400-b3b6-8b3e1334401e" containerName="mariadb-database-create" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.436452 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad3ed01-a668-4335-9254-46a2c1704e90" containerName="mariadb-account-create-update" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.436996 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.439267 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2jbsq" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.439470 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.463903 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6vlkr"] Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.536474 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-config-data\") pod \"glance-db-sync-6vlkr\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.536547 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wdqr\" (UniqueName: 
\"kubernetes.io/projected/c93c4643-8ac8-4063-b35f-b6de695f42dd-kube-api-access-2wdqr\") pod \"glance-db-sync-6vlkr\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.536677 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-combined-ca-bundle\") pod \"glance-db-sync-6vlkr\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.536739 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-db-sync-config-data\") pod \"glance-db-sync-6vlkr\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.638646 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-combined-ca-bundle\") pod \"glance-db-sync-6vlkr\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.638774 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-db-sync-config-data\") pod \"glance-db-sync-6vlkr\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.638838 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-config-data\") pod \"glance-db-sync-6vlkr\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.638866 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wdqr\" (UniqueName: \"kubernetes.io/projected/c93c4643-8ac8-4063-b35f-b6de695f42dd-kube-api-access-2wdqr\") pod \"glance-db-sync-6vlkr\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.643331 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-combined-ca-bundle\") pod \"glance-db-sync-6vlkr\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.643361 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-config-data\") pod \"glance-db-sync-6vlkr\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.643749 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-db-sync-config-data\") pod \"glance-db-sync-6vlkr\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.657673 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wdqr\" (UniqueName: \"kubernetes.io/projected/c93c4643-8ac8-4063-b35f-b6de695f42dd-kube-api-access-2wdqr\") pod \"glance-db-sync-6vlkr\" (UID: 
\"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.744248 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:59:19 crc kubenswrapper[4776]: I1204 09:59:19.758333 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:20 crc kubenswrapper[4776]: I1204 09:59:20.016319 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-tchdq" podUID="1100839e-9cfb-4361-a653-321d0d431072" containerName="ovn-controller" probeResult="failure" output=< Dec 04 09:59:20 crc kubenswrapper[4776]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 09:59:20 crc kubenswrapper[4776]: > Dec 04 09:59:20 crc kubenswrapper[4776]: I1204 09:59:20.264761 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6vlkr"] Dec 04 09:59:20 crc kubenswrapper[4776]: I1204 09:59:20.518995 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6vlkr" event={"ID":"c93c4643-8ac8-4063-b35f-b6de695f42dd","Type":"ContainerStarted","Data":"b99d806d4624d55f640848821831217e0e40abb813f69d1454098e7d0952c2ea"} Dec 04 09:59:20 crc kubenswrapper[4776]: I1204 09:59:20.521272 4776 generic.go:334] "Generic (PLEG): container finished" podID="1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" containerID="b3555c8dd05aa4b2bc15c3ab727e5bc2259436e7c875db3cd332992042699a3c" exitCode=0 Dec 04 09:59:20 crc kubenswrapper[4776]: I1204 09:59:20.521311 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e","Type":"ContainerDied","Data":"b3555c8dd05aa4b2bc15c3ab727e5bc2259436e7c875db3cd332992042699a3c"} Dec 04 09:59:21 crc kubenswrapper[4776]: I1204 09:59:21.411247 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 09:59:21 crc kubenswrapper[4776]: I1204 09:59:21.538953 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e","Type":"ContainerStarted","Data":"7da670dd0ffc0608183b33a160fbcca393ab32a10f514259c060c08d43b6ff6d"} Dec 04 09:59:21 crc kubenswrapper[4776]: I1204 09:59:21.539821 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 09:59:21 crc kubenswrapper[4776]: I1204 09:59:21.574650 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371964.280146 podStartE2EDuration="1m12.574629268s" podCreationTimestamp="2025-12-04 09:58:09 +0000 UTC" firstStartedPulling="2025-12-04 09:58:12.092969183 +0000 UTC m=+1136.959449560" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:59:21.569208349 +0000 UTC m=+1206.435688726" watchObservedRunningTime="2025-12-04 09:59:21.574629268 +0000 UTC m=+1206.441109645" Dec 04 09:59:24 crc kubenswrapper[4776]: I1204 09:59:24.749625 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9mnct" Dec 04 09:59:24 crc kubenswrapper[4776]: I1204 09:59:24.975927 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-tchdq-config-vrqfm"] Dec 04 09:59:24 crc kubenswrapper[4776]: I1204 09:59:24.977161 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:24 crc kubenswrapper[4776]: I1204 09:59:24.982288 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.046269 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-run\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.046404 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dee69fac-c996-4bb5-b1e7-1366e3e45010-additional-scripts\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.046472 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-log-ovn\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.046517 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-run-ovn\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.046591 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dee69fac-c996-4bb5-b1e7-1366e3e45010-scripts\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.046643 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89qz6\" (UniqueName: \"kubernetes.io/projected/dee69fac-c996-4bb5-b1e7-1366e3e45010-kube-api-access-89qz6\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.046852 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tchdq-config-vrqfm"] Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.051974 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-tchdq" podUID="1100839e-9cfb-4361-a653-321d0d431072" containerName="ovn-controller" probeResult="failure" output=< Dec 04 09:59:25 crc kubenswrapper[4776]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 09:59:25 crc kubenswrapper[4776]: > Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.147888 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-run\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.147987 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dee69fac-c996-4bb5-b1e7-1366e3e45010-additional-scripts\") 
pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.148024 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-log-ovn\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.148052 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-run-ovn\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.148109 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dee69fac-c996-4bb5-b1e7-1366e3e45010-scripts\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.148141 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89qz6\" (UniqueName: \"kubernetes.io/projected/dee69fac-c996-4bb5-b1e7-1366e3e45010-kube-api-access-89qz6\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.148813 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-run\") pod 
\"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.149087 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-run-ovn\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.149185 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-log-ovn\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.149577 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dee69fac-c996-4bb5-b1e7-1366e3e45010-additional-scripts\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.151798 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dee69fac-c996-4bb5-b1e7-1366e3e45010-scripts\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.181765 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89qz6\" (UniqueName: \"kubernetes.io/projected/dee69fac-c996-4bb5-b1e7-1366e3e45010-kube-api-access-89qz6\") pod \"ovn-controller-tchdq-config-vrqfm\" (UID: 
\"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.300347 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:25 crc kubenswrapper[4776]: I1204 09:59:25.881891 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tchdq-config-vrqfm"] Dec 04 09:59:25 crc kubenswrapper[4776]: W1204 09:59:25.887481 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddee69fac_c996_4bb5_b1e7_1366e3e45010.slice/crio-304b498547abf24afef49189f8bb88f8f7ef907b564e3b2f20a196b8f2daf957 WatchSource:0}: Error finding container 304b498547abf24afef49189f8bb88f8f7ef907b564e3b2f20a196b8f2daf957: Status 404 returned error can't find the container with id 304b498547abf24afef49189f8bb88f8f7ef907b564e3b2f20a196b8f2daf957 Dec 04 09:59:26 crc kubenswrapper[4776]: I1204 09:59:26.598798 4776 generic.go:334] "Generic (PLEG): container finished" podID="dee69fac-c996-4bb5-b1e7-1366e3e45010" containerID="5a91c13bf2f6fd9c013eefb602b3f9ab04b60c36f0b1dd4440db61dff31235b8" exitCode=0 Dec 04 09:59:26 crc kubenswrapper[4776]: I1204 09:59:26.598979 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tchdq-config-vrqfm" event={"ID":"dee69fac-c996-4bb5-b1e7-1366e3e45010","Type":"ContainerDied","Data":"5a91c13bf2f6fd9c013eefb602b3f9ab04b60c36f0b1dd4440db61dff31235b8"} Dec 04 09:59:26 crc kubenswrapper[4776]: I1204 09:59:26.599137 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tchdq-config-vrqfm" event={"ID":"dee69fac-c996-4bb5-b1e7-1366e3e45010","Type":"ContainerStarted","Data":"304b498547abf24afef49189f8bb88f8f7ef907b564e3b2f20a196b8f2daf957"} Dec 04 09:59:30 crc kubenswrapper[4776]: I1204 09:59:30.019422 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovn-controller-tchdq" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.025122 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.330206 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qzm7k"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.331749 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qzm7k" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.354630 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qzm7k"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.375396 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7507bda5-9608-4cd8-b40b-f9e69a06d41c-operator-scripts\") pod \"cinder-db-create-qzm7k\" (UID: \"7507bda5-9608-4cd8-b40b-f9e69a06d41c\") " pod="openstack/cinder-db-create-qzm7k" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.375578 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xr8n\" (UniqueName: \"kubernetes.io/projected/7507bda5-9608-4cd8-b40b-f9e69a06d41c-kube-api-access-9xr8n\") pod \"cinder-db-create-qzm7k\" (UID: \"7507bda5-9608-4cd8-b40b-f9e69a06d41c\") " pod="openstack/cinder-db-create-qzm7k" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.442170 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-trxt7"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.443493 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-trxt7" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.466349 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-245b-account-create-update-pckpb"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.467491 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-trxt7"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.467583 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-245b-account-create-update-pckpb" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.477009 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xr8n\" (UniqueName: \"kubernetes.io/projected/7507bda5-9608-4cd8-b40b-f9e69a06d41c-kube-api-access-9xr8n\") pod \"cinder-db-create-qzm7k\" (UID: \"7507bda5-9608-4cd8-b40b-f9e69a06d41c\") " pod="openstack/cinder-db-create-qzm7k" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.477095 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7507bda5-9608-4cd8-b40b-f9e69a06d41c-operator-scripts\") pod \"cinder-db-create-qzm7k\" (UID: \"7507bda5-9608-4cd8-b40b-f9e69a06d41c\") " pod="openstack/cinder-db-create-qzm7k" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.477953 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7507bda5-9608-4cd8-b40b-f9e69a06d41c-operator-scripts\") pod \"cinder-db-create-qzm7k\" (UID: \"7507bda5-9608-4cd8-b40b-f9e69a06d41c\") " pod="openstack/cinder-db-create-qzm7k" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.484214 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.503292 4776 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/barbican-245b-account-create-update-pckpb"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.527618 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xr8n\" (UniqueName: \"kubernetes.io/projected/7507bda5-9608-4cd8-b40b-f9e69a06d41c-kube-api-access-9xr8n\") pod \"cinder-db-create-qzm7k\" (UID: \"7507bda5-9608-4cd8-b40b-f9e69a06d41c\") " pod="openstack/cinder-db-create-qzm7k" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.556354 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-0a78-account-create-update-77b4l"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.557436 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a78-account-create-update-77b4l" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.560481 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.573678 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0a78-account-create-update-77b4l"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.578620 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnqhj\" (UniqueName: \"kubernetes.io/projected/718a3f24-cae1-449e-b979-0058c19dbe4b-kube-api-access-rnqhj\") pod \"barbican-245b-account-create-update-pckpb\" (UID: \"718a3f24-cae1-449e-b979-0058c19dbe4b\") " pod="openstack/barbican-245b-account-create-update-pckpb" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.578666 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd69365-f194-4607-ba00-f17ed2acbdb9-operator-scripts\") pod \"barbican-db-create-trxt7\" (UID: \"dcd69365-f194-4607-ba00-f17ed2acbdb9\") " 
pod="openstack/barbican-db-create-trxt7" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.578727 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/718a3f24-cae1-449e-b979-0058c19dbe4b-operator-scripts\") pod \"barbican-245b-account-create-update-pckpb\" (UID: \"718a3f24-cae1-449e-b979-0058c19dbe4b\") " pod="openstack/barbican-245b-account-create-update-pckpb" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.578821 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8d4p\" (UniqueName: \"kubernetes.io/projected/dcd69365-f194-4607-ba00-f17ed2acbdb9-kube-api-access-f8d4p\") pod \"barbican-db-create-trxt7\" (UID: \"dcd69365-f194-4607-ba00-f17ed2acbdb9\") " pod="openstack/barbican-db-create-trxt7" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.637957 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2lscm"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.639046 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2lscm" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.650308 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2lscm"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.650676 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qzm7k" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.680780 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnqhj\" (UniqueName: \"kubernetes.io/projected/718a3f24-cae1-449e-b979-0058c19dbe4b-kube-api-access-rnqhj\") pod \"barbican-245b-account-create-update-pckpb\" (UID: \"718a3f24-cae1-449e-b979-0058c19dbe4b\") " pod="openstack/barbican-245b-account-create-update-pckpb" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.680844 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd69365-f194-4607-ba00-f17ed2acbdb9-operator-scripts\") pod \"barbican-db-create-trxt7\" (UID: \"dcd69365-f194-4607-ba00-f17ed2acbdb9\") " pod="openstack/barbican-db-create-trxt7" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.680944 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/718a3f24-cae1-449e-b979-0058c19dbe4b-operator-scripts\") pod \"barbican-245b-account-create-update-pckpb\" (UID: \"718a3f24-cae1-449e-b979-0058c19dbe4b\") " pod="openstack/barbican-245b-account-create-update-pckpb" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.680990 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8d4p\" (UniqueName: \"kubernetes.io/projected/dcd69365-f194-4607-ba00-f17ed2acbdb9-kube-api-access-f8d4p\") pod \"barbican-db-create-trxt7\" (UID: \"dcd69365-f194-4607-ba00-f17ed2acbdb9\") " pod="openstack/barbican-db-create-trxt7" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.681039 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f4935bc-a17c-4ded-b454-21eb494550e5-operator-scripts\") pod 
\"cinder-0a78-account-create-update-77b4l\" (UID: \"7f4935bc-a17c-4ded-b454-21eb494550e5\") " pod="openstack/cinder-0a78-account-create-update-77b4l" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.681084 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71086560-4754-4032-b819-35a007beb5fd-operator-scripts\") pod \"neutron-db-create-2lscm\" (UID: \"71086560-4754-4032-b819-35a007beb5fd\") " pod="openstack/neutron-db-create-2lscm" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.681134 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqqgn\" (UniqueName: \"kubernetes.io/projected/71086560-4754-4032-b819-35a007beb5fd-kube-api-access-bqqgn\") pod \"neutron-db-create-2lscm\" (UID: \"71086560-4754-4032-b819-35a007beb5fd\") " pod="openstack/neutron-db-create-2lscm" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.681193 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq5rv\" (UniqueName: \"kubernetes.io/projected/7f4935bc-a17c-4ded-b454-21eb494550e5-kube-api-access-fq5rv\") pod \"cinder-0a78-account-create-update-77b4l\" (UID: \"7f4935bc-a17c-4ded-b454-21eb494550e5\") " pod="openstack/cinder-0a78-account-create-update-77b4l" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.682302 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd69365-f194-4607-ba00-f17ed2acbdb9-operator-scripts\") pod \"barbican-db-create-trxt7\" (UID: \"dcd69365-f194-4607-ba00-f17ed2acbdb9\") " pod="openstack/barbican-db-create-trxt7" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.683089 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/718a3f24-cae1-449e-b979-0058c19dbe4b-operator-scripts\") pod \"barbican-245b-account-create-update-pckpb\" (UID: \"718a3f24-cae1-449e-b979-0058c19dbe4b\") " pod="openstack/barbican-245b-account-create-update-pckpb" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.707068 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnqhj\" (UniqueName: \"kubernetes.io/projected/718a3f24-cae1-449e-b979-0058c19dbe4b-kube-api-access-rnqhj\") pod \"barbican-245b-account-create-update-pckpb\" (UID: \"718a3f24-cae1-449e-b979-0058c19dbe4b\") " pod="openstack/barbican-245b-account-create-update-pckpb" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.717997 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8d4p\" (UniqueName: \"kubernetes.io/projected/dcd69365-f194-4607-ba00-f17ed2acbdb9-kube-api-access-f8d4p\") pod \"barbican-db-create-trxt7\" (UID: \"dcd69365-f194-4607-ba00-f17ed2acbdb9\") " pod="openstack/barbican-db-create-trxt7" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.722574 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-p2lqk"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.723556 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:31 crc kubenswrapper[4776]: W1204 09:59:31.727509 4776 reflector.go:561] object-"openstack"/"keystone": failed to list *v1.Secret: secrets "keystone" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 04 09:59:31 crc kubenswrapper[4776]: E1204 09:59:31.727550 4776 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"keystone\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 04 09:59:31 crc kubenswrapper[4776]: W1204 09:59:31.727587 4776 reflector.go:561] object-"openstack"/"keystone-config-data": failed to list *v1.Secret: secrets "keystone-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 04 09:59:31 crc kubenswrapper[4776]: E1204 09:59:31.727597 4776 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"keystone-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 04 09:59:31 crc kubenswrapper[4776]: W1204 09:59:31.727639 4776 reflector.go:561] object-"openstack"/"keystone-keystone-dockercfg-cwwkl": failed to list *v1.Secret: secrets "keystone-keystone-dockercfg-cwwkl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and 
this object Dec 04 09:59:31 crc kubenswrapper[4776]: E1204 09:59:31.727649 4776 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"keystone-keystone-dockercfg-cwwkl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone-keystone-dockercfg-cwwkl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 04 09:59:31 crc kubenswrapper[4776]: W1204 09:59:31.727686 4776 reflector.go:561] object-"openstack"/"keystone-scripts": failed to list *v1.Secret: secrets "keystone-scripts" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 04 09:59:31 crc kubenswrapper[4776]: E1204 09:59:31.727695 4776 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"keystone-scripts\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone-scripts\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.748225 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p2lqk"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.763359 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-trxt7" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.768646 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c65e-account-create-update-8tkss"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.769950 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c65e-account-create-update-8tkss" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.775566 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.783000 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq5rv\" (UniqueName: \"kubernetes.io/projected/7f4935bc-a17c-4ded-b454-21eb494550e5-kube-api-access-fq5rv\") pod \"cinder-0a78-account-create-update-77b4l\" (UID: \"7f4935bc-a17c-4ded-b454-21eb494550e5\") " pod="openstack/cinder-0a78-account-create-update-77b4l" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.783111 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-combined-ca-bundle\") pod \"keystone-db-sync-p2lqk\" (UID: \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\") " pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.783159 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px26c\" (UniqueName: \"kubernetes.io/projected/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-kube-api-access-px26c\") pod \"keystone-db-sync-p2lqk\" (UID: \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\") " pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.783185 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f4935bc-a17c-4ded-b454-21eb494550e5-operator-scripts\") pod \"cinder-0a78-account-create-update-77b4l\" (UID: \"7f4935bc-a17c-4ded-b454-21eb494550e5\") " pod="openstack/cinder-0a78-account-create-update-77b4l" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.783221 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71086560-4754-4032-b819-35a007beb5fd-operator-scripts\") pod \"neutron-db-create-2lscm\" (UID: \"71086560-4754-4032-b819-35a007beb5fd\") " pod="openstack/neutron-db-create-2lscm" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.783243 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-config-data\") pod \"keystone-db-sync-p2lqk\" (UID: \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\") " pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.783268 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqqgn\" (UniqueName: \"kubernetes.io/projected/71086560-4754-4032-b819-35a007beb5fd-kube-api-access-bqqgn\") pod \"neutron-db-create-2lscm\" (UID: \"71086560-4754-4032-b819-35a007beb5fd\") " pod="openstack/neutron-db-create-2lscm" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.790161 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c65e-account-create-update-8tkss"] Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.791364 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f4935bc-a17c-4ded-b454-21eb494550e5-operator-scripts\") pod \"cinder-0a78-account-create-update-77b4l\" (UID: \"7f4935bc-a17c-4ded-b454-21eb494550e5\") " pod="openstack/cinder-0a78-account-create-update-77b4l" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.793439 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71086560-4754-4032-b819-35a007beb5fd-operator-scripts\") pod \"neutron-db-create-2lscm\" (UID: \"71086560-4754-4032-b819-35a007beb5fd\") " 
pod="openstack/neutron-db-create-2lscm" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.825028 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-245b-account-create-update-pckpb" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.841801 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqqgn\" (UniqueName: \"kubernetes.io/projected/71086560-4754-4032-b819-35a007beb5fd-kube-api-access-bqqgn\") pod \"neutron-db-create-2lscm\" (UID: \"71086560-4754-4032-b819-35a007beb5fd\") " pod="openstack/neutron-db-create-2lscm" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.839899 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq5rv\" (UniqueName: \"kubernetes.io/projected/7f4935bc-a17c-4ded-b454-21eb494550e5-kube-api-access-fq5rv\") pod \"cinder-0a78-account-create-update-77b4l\" (UID: \"7f4935bc-a17c-4ded-b454-21eb494550e5\") " pod="openstack/cinder-0a78-account-create-update-77b4l" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.883239 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-0a78-account-create-update-77b4l" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.885738 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5hcx\" (UniqueName: \"kubernetes.io/projected/6c7dcd00-07c0-4e29-b608-e43d0e09cfba-kube-api-access-x5hcx\") pod \"neutron-c65e-account-create-update-8tkss\" (UID: \"6c7dcd00-07c0-4e29-b608-e43d0e09cfba\") " pod="openstack/neutron-c65e-account-create-update-8tkss" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.885867 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-combined-ca-bundle\") pod \"keystone-db-sync-p2lqk\" (UID: \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\") " pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.885922 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px26c\" (UniqueName: \"kubernetes.io/projected/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-kube-api-access-px26c\") pod \"keystone-db-sync-p2lqk\" (UID: \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\") " pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.885951 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c7dcd00-07c0-4e29-b608-e43d0e09cfba-operator-scripts\") pod \"neutron-c65e-account-create-update-8tkss\" (UID: \"6c7dcd00-07c0-4e29-b608-e43d0e09cfba\") " pod="openstack/neutron-c65e-account-create-update-8tkss" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.885985 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-config-data\") pod 
\"keystone-db-sync-p2lqk\" (UID: \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\") " pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.890767 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-combined-ca-bundle\") pod \"keystone-db-sync-p2lqk\" (UID: \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\") " pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.903503 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px26c\" (UniqueName: \"kubernetes.io/projected/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-kube-api-access-px26c\") pod \"keystone-db-sync-p2lqk\" (UID: \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\") " pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.957523 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2lscm" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.988103 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c7dcd00-07c0-4e29-b608-e43d0e09cfba-operator-scripts\") pod \"neutron-c65e-account-create-update-8tkss\" (UID: \"6c7dcd00-07c0-4e29-b608-e43d0e09cfba\") " pod="openstack/neutron-c65e-account-create-update-8tkss" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.988698 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5hcx\" (UniqueName: \"kubernetes.io/projected/6c7dcd00-07c0-4e29-b608-e43d0e09cfba-kube-api-access-x5hcx\") pod \"neutron-c65e-account-create-update-8tkss\" (UID: \"6c7dcd00-07c0-4e29-b608-e43d0e09cfba\") " pod="openstack/neutron-c65e-account-create-update-8tkss" Dec 04 09:59:31 crc kubenswrapper[4776]: I1204 09:59:31.989564 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c7dcd00-07c0-4e29-b608-e43d0e09cfba-operator-scripts\") pod \"neutron-c65e-account-create-update-8tkss\" (UID: \"6c7dcd00-07c0-4e29-b608-e43d0e09cfba\") " pod="openstack/neutron-c65e-account-create-update-8tkss" Dec 04 09:59:32 crc kubenswrapper[4776]: I1204 09:59:32.010604 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5hcx\" (UniqueName: \"kubernetes.io/projected/6c7dcd00-07c0-4e29-b608-e43d0e09cfba-kube-api-access-x5hcx\") pod \"neutron-c65e-account-create-update-8tkss\" (UID: \"6c7dcd00-07c0-4e29-b608-e43d0e09cfba\") " pod="openstack/neutron-c65e-account-create-update-8tkss" Dec 04 09:59:32 crc kubenswrapper[4776]: I1204 09:59:32.185462 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c65e-account-create-update-8tkss" Dec 04 09:59:32 crc kubenswrapper[4776]: I1204 09:59:32.578064 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cwwkl" Dec 04 09:59:32 crc kubenswrapper[4776]: I1204 09:59:32.813422 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 09:59:32 crc kubenswrapper[4776]: I1204 09:59:32.823240 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 09:59:32 crc kubenswrapper[4776]: E1204 09:59:32.886960 4776 secret.go:188] Couldn't get secret openstack/keystone-config-data: failed to sync secret cache: timed out waiting for the condition Dec 04 09:59:32 crc kubenswrapper[4776]: E1204 09:59:32.887069 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-config-data podName:eb9bffe6-0182-47a3-b5f6-86297c6f5c92 nodeName:}" failed. No retries permitted until 2025-12-04 09:59:33.387040675 +0000 UTC m=+1218.253521052 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-config-data") pod "keystone-db-sync-p2lqk" (UID: "eb9bffe6-0182-47a3-b5f6-86297c6f5c92") : failed to sync secret cache: timed out waiting for the condition Dec 04 09:59:32 crc kubenswrapper[4776]: I1204 09:59:32.958580 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 09:59:33 crc kubenswrapper[4776]: I1204 09:59:33.413672 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-config-data\") pod \"keystone-db-sync-p2lqk\" (UID: \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\") " pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:33 crc kubenswrapper[4776]: I1204 09:59:33.418909 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-config-data\") pod \"keystone-db-sync-p2lqk\" (UID: \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\") " pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:33 crc kubenswrapper[4776]: I1204 09:59:33.675993 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.611851 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.733836 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tchdq-config-vrqfm" event={"ID":"dee69fac-c996-4bb5-b1e7-1366e3e45010","Type":"ContainerDied","Data":"304b498547abf24afef49189f8bb88f8f7ef907b564e3b2f20a196b8f2daf957"} Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.733879 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="304b498547abf24afef49189f8bb88f8f7ef907b564e3b2f20a196b8f2daf957" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.733952 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tchdq-config-vrqfm" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.778440 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89qz6\" (UniqueName: \"kubernetes.io/projected/dee69fac-c996-4bb5-b1e7-1366e3e45010-kube-api-access-89qz6\") pod \"dee69fac-c996-4bb5-b1e7-1366e3e45010\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.779076 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dee69fac-c996-4bb5-b1e7-1366e3e45010-additional-scripts\") pod \"dee69fac-c996-4bb5-b1e7-1366e3e45010\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.779114 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-run\") pod \"dee69fac-c996-4bb5-b1e7-1366e3e45010\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.779237 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-run-ovn\") pod \"dee69fac-c996-4bb5-b1e7-1366e3e45010\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.779292 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-log-ovn\") pod \"dee69fac-c996-4bb5-b1e7-1366e3e45010\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.779339 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dee69fac-c996-4bb5-b1e7-1366e3e45010-scripts\") pod \"dee69fac-c996-4bb5-b1e7-1366e3e45010\" (UID: \"dee69fac-c996-4bb5-b1e7-1366e3e45010\") " Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.779517 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-run" (OuterVolumeSpecName: "var-run") pod "dee69fac-c996-4bb5-b1e7-1366e3e45010" (UID: "dee69fac-c996-4bb5-b1e7-1366e3e45010"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.780248 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dee69fac-c996-4bb5-b1e7-1366e3e45010-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "dee69fac-c996-4bb5-b1e7-1366e3e45010" (UID: "dee69fac-c996-4bb5-b1e7-1366e3e45010"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.780302 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "dee69fac-c996-4bb5-b1e7-1366e3e45010" (UID: "dee69fac-c996-4bb5-b1e7-1366e3e45010"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.780328 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "dee69fac-c996-4bb5-b1e7-1366e3e45010" (UID: "dee69fac-c996-4bb5-b1e7-1366e3e45010"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.782546 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dee69fac-c996-4bb5-b1e7-1366e3e45010-scripts" (OuterVolumeSpecName: "scripts") pod "dee69fac-c996-4bb5-b1e7-1366e3e45010" (UID: "dee69fac-c996-4bb5-b1e7-1366e3e45010"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.789324 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee69fac-c996-4bb5-b1e7-1366e3e45010-kube-api-access-89qz6" (OuterVolumeSpecName: "kube-api-access-89qz6") pod "dee69fac-c996-4bb5-b1e7-1366e3e45010" (UID: "dee69fac-c996-4bb5-b1e7-1366e3e45010"). InnerVolumeSpecName "kube-api-access-89qz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.793225 4776 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dee69fac-c996-4bb5-b1e7-1366e3e45010-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.793266 4776 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.793299 4776 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.793313 4776 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dee69fac-c996-4bb5-b1e7-1366e3e45010-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.793325 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dee69fac-c996-4bb5-b1e7-1366e3e45010-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:35 crc kubenswrapper[4776]: I1204 09:59:35.793337 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89qz6\" (UniqueName: \"kubernetes.io/projected/dee69fac-c996-4bb5-b1e7-1366e3e45010-kube-api-access-89qz6\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.194078 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p2lqk"] Dec 04 09:59:36 crc kubenswrapper[4776]: W1204 09:59:36.200035 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb9bffe6_0182_47a3_b5f6_86297c6f5c92.slice/crio-2ae36e58d0b41ac3cb8cc5cc45fd7724ff450b727a213acd8371c5967ac83c66 WatchSource:0}: Error finding container 2ae36e58d0b41ac3cb8cc5cc45fd7724ff450b727a213acd8371c5967ac83c66: Status 404 returned error can't find the container with id 2ae36e58d0b41ac3cb8cc5cc45fd7724ff450b727a213acd8371c5967ac83c66 Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.293313 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-245b-account-create-update-pckpb"] Dec 04 09:59:36 crc kubenswrapper[4776]: W1204 09:59:36.295853 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7507bda5_9608_4cd8_b40b_f9e69a06d41c.slice/crio-db79b6b36185dc92aeff026a862c63d4a9ac2867a8f280f8dd4e4454a80133af WatchSource:0}: Error finding container db79b6b36185dc92aeff026a862c63d4a9ac2867a8f280f8dd4e4454a80133af: Status 404 returned error can't find the container with id db79b6b36185dc92aeff026a862c63d4a9ac2867a8f280f8dd4e4454a80133af Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.304542 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qzm7k"] Dec 04 09:59:36 crc kubenswrapper[4776]: W1204 09:59:36.306569 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod718a3f24_cae1_449e_b979_0058c19dbe4b.slice/crio-43ebc641a984f01f8ad505ba932ced9c53e75ba381cecaa924d31173e1e066ee WatchSource:0}: Error finding container 43ebc641a984f01f8ad505ba932ced9c53e75ba381cecaa924d31173e1e066ee: Status 404 returned error can't find the container with id 43ebc641a984f01f8ad505ba932ced9c53e75ba381cecaa924d31173e1e066ee Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.398719 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-c65e-account-create-update-8tkss"] Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.411720 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2lscm"] Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.424775 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0a78-account-create-update-77b4l"] Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.436804 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-trxt7"] Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.724533 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-tchdq-config-vrqfm"] Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.732601 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-tchdq-config-vrqfm"] Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.748102 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-trxt7" event={"ID":"dcd69365-f194-4607-ba00-f17ed2acbdb9","Type":"ContainerStarted","Data":"f640896ba92c796bf5a1421a589dd69bc7bdb33743b249a5f7c59426c896528b"} Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.754864 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c65e-account-create-update-8tkss" event={"ID":"6c7dcd00-07c0-4e29-b608-e43d0e09cfba","Type":"ContainerStarted","Data":"3dbf68a8d0845943469a2fa0292fca2782fbe2e65b0e457b50abbe4db6ec38bc"} Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.756545 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2lscm" event={"ID":"71086560-4754-4032-b819-35a007beb5fd","Type":"ContainerStarted","Data":"afe64fc1b2cc1a75bf3006e6ed5a2b2358bc666f7cad6e08d57ee425d3b5c8fc"} Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.758821 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6vlkr" 
event={"ID":"c93c4643-8ac8-4063-b35f-b6de695f42dd","Type":"ContainerStarted","Data":"bf14193fa054783ce050dbba93b5a5a233448023726cfeecd16c960f32165542"} Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.762782 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0a78-account-create-update-77b4l" event={"ID":"7f4935bc-a17c-4ded-b454-21eb494550e5","Type":"ContainerStarted","Data":"119f4bc0f1d67da1cee14fccbb1c9c63d33cbd176443eb7990ea557d64eb0533"} Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.768626 4776 generic.go:334] "Generic (PLEG): container finished" podID="7507bda5-9608-4cd8-b40b-f9e69a06d41c" containerID="215eb17baeaabf160ea79f4430a65564a741d6d7683d8872a40f6dc4a2e91e9e" exitCode=0 Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.768719 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qzm7k" event={"ID":"7507bda5-9608-4cd8-b40b-f9e69a06d41c","Type":"ContainerDied","Data":"215eb17baeaabf160ea79f4430a65564a741d6d7683d8872a40f6dc4a2e91e9e"} Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.768750 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qzm7k" event={"ID":"7507bda5-9608-4cd8-b40b-f9e69a06d41c","Type":"ContainerStarted","Data":"db79b6b36185dc92aeff026a862c63d4a9ac2867a8f280f8dd4e4454a80133af"} Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.777272 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-245b-account-create-update-pckpb" event={"ID":"718a3f24-cae1-449e-b979-0058c19dbe4b","Type":"ContainerStarted","Data":"21cf6fe4b4c030a58288e7e8cb7ce5989b2ae45d534dc9813783295da995e03f"} Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.777326 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-245b-account-create-update-pckpb" 
event={"ID":"718a3f24-cae1-449e-b979-0058c19dbe4b","Type":"ContainerStarted","Data":"43ebc641a984f01f8ad505ba932ced9c53e75ba381cecaa924d31173e1e066ee"} Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.779089 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p2lqk" event={"ID":"eb9bffe6-0182-47a3-b5f6-86297c6f5c92","Type":"ContainerStarted","Data":"2ae36e58d0b41ac3cb8cc5cc45fd7724ff450b727a213acd8371c5967ac83c66"} Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.790503 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-6vlkr" podStartSLOduration=2.34604854 podStartE2EDuration="17.790483192s" podCreationTimestamp="2025-12-04 09:59:19 +0000 UTC" firstStartedPulling="2025-12-04 09:59:20.282812825 +0000 UTC m=+1205.149293202" lastFinishedPulling="2025-12-04 09:59:35.727247477 +0000 UTC m=+1220.593727854" observedRunningTime="2025-12-04 09:59:36.790212993 +0000 UTC m=+1221.656693370" watchObservedRunningTime="2025-12-04 09:59:36.790483192 +0000 UTC m=+1221.656963569" Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.826866 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-245b-account-create-update-pckpb" podStartSLOduration=5.826844649 podStartE2EDuration="5.826844649s" podCreationTimestamp="2025-12-04 09:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:59:36.810197808 +0000 UTC m=+1221.676678185" watchObservedRunningTime="2025-12-04 09:59:36.826844649 +0000 UTC m=+1221.693325016" Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.851736 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-tchdq-config-77gg2"] Dec 04 09:59:36 crc kubenswrapper[4776]: E1204 09:59:36.852424 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dee69fac-c996-4bb5-b1e7-1366e3e45010" containerName="ovn-config" Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.852454 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee69fac-c996-4bb5-b1e7-1366e3e45010" containerName="ovn-config" Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.852691 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee69fac-c996-4bb5-b1e7-1366e3e45010" containerName="ovn-config" Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.853563 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.864529 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.874670 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tchdq-config-77gg2"] Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.927111 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-run-ovn\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.927165 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-run\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.927187 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/c2602204-68c3-475a-b689-23ec20a95b9c-additional-scripts\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.927209 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2602204-68c3-475a-b689-23ec20a95b9c-scripts\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.927243 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8svcw\" (UniqueName: \"kubernetes.io/projected/c2602204-68c3-475a-b689-23ec20a95b9c-kube-api-access-8svcw\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:36 crc kubenswrapper[4776]: I1204 09:59:36.927313 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-log-ovn\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.029354 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-log-ovn\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.029761 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-run-ovn\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.029789 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-run\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.029805 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c2602204-68c3-475a-b689-23ec20a95b9c-additional-scripts\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.029890 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2602204-68c3-475a-b689-23ec20a95b9c-scripts\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.029675 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-log-ovn\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.029895 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-run-ovn\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.029864 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-run\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.030595 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c2602204-68c3-475a-b689-23ec20a95b9c-additional-scripts\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.032249 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2602204-68c3-475a-b689-23ec20a95b9c-scripts\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.032465 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8svcw\" (UniqueName: \"kubernetes.io/projected/c2602204-68c3-475a-b689-23ec20a95b9c-kube-api-access-8svcw\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.055123 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8svcw\" (UniqueName: 
\"kubernetes.io/projected/c2602204-68c3-475a-b689-23ec20a95b9c-kube-api-access-8svcw\") pod \"ovn-controller-tchdq-config-77gg2\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.223193 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.473030 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dee69fac-c996-4bb5-b1e7-1366e3e45010" path="/var/lib/kubelet/pods/dee69fac-c996-4bb5-b1e7-1366e3e45010/volumes" Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.687136 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tchdq-config-77gg2"] Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.801160 4776 generic.go:334] "Generic (PLEG): container finished" podID="dcd69365-f194-4607-ba00-f17ed2acbdb9" containerID="51875faf3a5817f0207df4d4130509a0a1d19bdcedf23ab5db21c33708fedfe1" exitCode=0 Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.801303 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-trxt7" event={"ID":"dcd69365-f194-4607-ba00-f17ed2acbdb9","Type":"ContainerDied","Data":"51875faf3a5817f0207df4d4130509a0a1d19bdcedf23ab5db21c33708fedfe1"} Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.806527 4776 generic.go:334] "Generic (PLEG): container finished" podID="6c7dcd00-07c0-4e29-b608-e43d0e09cfba" containerID="13bf0549a19da56078d92bf35ceef7c3032c03631eb85ca68ff7670f3afcd526" exitCode=0 Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.806626 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c65e-account-create-update-8tkss" event={"ID":"6c7dcd00-07c0-4e29-b608-e43d0e09cfba","Type":"ContainerDied","Data":"13bf0549a19da56078d92bf35ceef7c3032c03631eb85ca68ff7670f3afcd526"} Dec 04 09:59:37 
crc kubenswrapper[4776]: I1204 09:59:37.809529 4776 generic.go:334] "Generic (PLEG): container finished" podID="71086560-4754-4032-b819-35a007beb5fd" containerID="d5e8e6fe56d7b7af2e687137706b3465707aad1a1e0d5a4f26267fca96eb8a92" exitCode=0 Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.809612 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2lscm" event={"ID":"71086560-4754-4032-b819-35a007beb5fd","Type":"ContainerDied","Data":"d5e8e6fe56d7b7af2e687137706b3465707aad1a1e0d5a4f26267fca96eb8a92"} Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.811843 4776 generic.go:334] "Generic (PLEG): container finished" podID="7f4935bc-a17c-4ded-b454-21eb494550e5" containerID="b8dc6e63c74e7238d0747cea357433abfe7422b4fe9b3446095739ad40e999e5" exitCode=0 Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.811893 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0a78-account-create-update-77b4l" event={"ID":"7f4935bc-a17c-4ded-b454-21eb494550e5","Type":"ContainerDied","Data":"b8dc6e63c74e7238d0747cea357433abfe7422b4fe9b3446095739ad40e999e5"} Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.817092 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tchdq-config-77gg2" event={"ID":"c2602204-68c3-475a-b689-23ec20a95b9c","Type":"ContainerStarted","Data":"7d1bf13d40df21d37c87e8fe105296b28e48a85097960eb9db1c3de7fb53bf49"} Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.824268 4776 generic.go:334] "Generic (PLEG): container finished" podID="718a3f24-cae1-449e-b979-0058c19dbe4b" containerID="21cf6fe4b4c030a58288e7e8cb7ce5989b2ae45d534dc9813783295da995e03f" exitCode=0 Dec 04 09:59:37 crc kubenswrapper[4776]: I1204 09:59:37.824475 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-245b-account-create-update-pckpb" 
event={"ID":"718a3f24-cae1-449e-b979-0058c19dbe4b","Type":"ContainerDied","Data":"21cf6fe4b4c030a58288e7e8cb7ce5989b2ae45d534dc9813783295da995e03f"} Dec 04 09:59:38 crc kubenswrapper[4776]: I1204 09:59:38.215248 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qzm7k" Dec 04 09:59:38 crc kubenswrapper[4776]: I1204 09:59:38.367065 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7507bda5-9608-4cd8-b40b-f9e69a06d41c-operator-scripts\") pod \"7507bda5-9608-4cd8-b40b-f9e69a06d41c\" (UID: \"7507bda5-9608-4cd8-b40b-f9e69a06d41c\") " Dec 04 09:59:38 crc kubenswrapper[4776]: I1204 09:59:38.367117 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xr8n\" (UniqueName: \"kubernetes.io/projected/7507bda5-9608-4cd8-b40b-f9e69a06d41c-kube-api-access-9xr8n\") pod \"7507bda5-9608-4cd8-b40b-f9e69a06d41c\" (UID: \"7507bda5-9608-4cd8-b40b-f9e69a06d41c\") " Dec 04 09:59:38 crc kubenswrapper[4776]: I1204 09:59:38.368468 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7507bda5-9608-4cd8-b40b-f9e69a06d41c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7507bda5-9608-4cd8-b40b-f9e69a06d41c" (UID: "7507bda5-9608-4cd8-b40b-f9e69a06d41c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:38 crc kubenswrapper[4776]: I1204 09:59:38.376867 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7507bda5-9608-4cd8-b40b-f9e69a06d41c-kube-api-access-9xr8n" (OuterVolumeSpecName: "kube-api-access-9xr8n") pod "7507bda5-9608-4cd8-b40b-f9e69a06d41c" (UID: "7507bda5-9608-4cd8-b40b-f9e69a06d41c"). InnerVolumeSpecName "kube-api-access-9xr8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:38 crc kubenswrapper[4776]: I1204 09:59:38.468829 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7507bda5-9608-4cd8-b40b-f9e69a06d41c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:38 crc kubenswrapper[4776]: I1204 09:59:38.468865 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xr8n\" (UniqueName: \"kubernetes.io/projected/7507bda5-9608-4cd8-b40b-f9e69a06d41c-kube-api-access-9xr8n\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:38 crc kubenswrapper[4776]: I1204 09:59:38.837364 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qzm7k" event={"ID":"7507bda5-9608-4cd8-b40b-f9e69a06d41c","Type":"ContainerDied","Data":"db79b6b36185dc92aeff026a862c63d4a9ac2867a8f280f8dd4e4454a80133af"} Dec 04 09:59:38 crc kubenswrapper[4776]: I1204 09:59:38.837736 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db79b6b36185dc92aeff026a862c63d4a9ac2867a8f280f8dd4e4454a80133af" Dec 04 09:59:38 crc kubenswrapper[4776]: I1204 09:59:38.837808 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qzm7k" Dec 04 09:59:38 crc kubenswrapper[4776]: I1204 09:59:38.892049 4776 generic.go:334] "Generic (PLEG): container finished" podID="c2602204-68c3-475a-b689-23ec20a95b9c" containerID="8cb3c87de8875ddc8a3f922151f62c3a7acb6802f63308985a0ffda8f5f02f0e" exitCode=0 Dec 04 09:59:38 crc kubenswrapper[4776]: I1204 09:59:38.892591 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tchdq-config-77gg2" event={"ID":"c2602204-68c3-475a-b689-23ec20a95b9c","Type":"ContainerDied","Data":"8cb3c87de8875ddc8a3f922151f62c3a7acb6802f63308985a0ffda8f5f02f0e"} Dec 04 09:59:41 crc kubenswrapper[4776]: I1204 09:59:41.924880 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-245b-account-create-update-pckpb" event={"ID":"718a3f24-cae1-449e-b979-0058c19dbe4b","Type":"ContainerDied","Data":"43ebc641a984f01f8ad505ba932ced9c53e75ba381cecaa924d31173e1e066ee"} Dec 04 09:59:41 crc kubenswrapper[4776]: I1204 09:59:41.925460 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43ebc641a984f01f8ad505ba932ced9c53e75ba381cecaa924d31173e1e066ee" Dec 04 09:59:41 crc kubenswrapper[4776]: I1204 09:59:41.926381 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0a78-account-create-update-77b4l" event={"ID":"7f4935bc-a17c-4ded-b454-21eb494550e5","Type":"ContainerDied","Data":"119f4bc0f1d67da1cee14fccbb1c9c63d33cbd176443eb7990ea557d64eb0533"} Dec 04 09:59:41 crc kubenswrapper[4776]: I1204 09:59:41.926411 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="119f4bc0f1d67da1cee14fccbb1c9c63d33cbd176443eb7990ea557d64eb0533" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.091551 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-245b-account-create-update-pckpb" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.127288 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a78-account-create-update-77b4l" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.157060 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2lscm" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.212187 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c65e-account-create-update-8tkss" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.222156 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.237029 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-trxt7" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.241956 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f4935bc-a17c-4ded-b454-21eb494550e5-operator-scripts\") pod \"7f4935bc-a17c-4ded-b454-21eb494550e5\" (UID: \"7f4935bc-a17c-4ded-b454-21eb494550e5\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.242987 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq5rv\" (UniqueName: \"kubernetes.io/projected/7f4935bc-a17c-4ded-b454-21eb494550e5-kube-api-access-fq5rv\") pod \"7f4935bc-a17c-4ded-b454-21eb494550e5\" (UID: \"7f4935bc-a17c-4ded-b454-21eb494550e5\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.243292 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/718a3f24-cae1-449e-b979-0058c19dbe4b-operator-scripts\") pod \"718a3f24-cae1-449e-b979-0058c19dbe4b\" (UID: \"718a3f24-cae1-449e-b979-0058c19dbe4b\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.243836 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnqhj\" (UniqueName: \"kubernetes.io/projected/718a3f24-cae1-449e-b979-0058c19dbe4b-kube-api-access-rnqhj\") pod \"718a3f24-cae1-449e-b979-0058c19dbe4b\" (UID: \"718a3f24-cae1-449e-b979-0058c19dbe4b\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.242926 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f4935bc-a17c-4ded-b454-21eb494550e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f4935bc-a17c-4ded-b454-21eb494550e5" (UID: "7f4935bc-a17c-4ded-b454-21eb494550e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.245451 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/718a3f24-cae1-449e-b979-0058c19dbe4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "718a3f24-cae1-449e-b979-0058c19dbe4b" (UID: "718a3f24-cae1-449e-b979-0058c19dbe4b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.248829 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f4935bc-a17c-4ded-b454-21eb494550e5-kube-api-access-fq5rv" (OuterVolumeSpecName: "kube-api-access-fq5rv") pod "7f4935bc-a17c-4ded-b454-21eb494550e5" (UID: "7f4935bc-a17c-4ded-b454-21eb494550e5"). InnerVolumeSpecName "kube-api-access-fq5rv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.270554 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/718a3f24-cae1-449e-b979-0058c19dbe4b-kube-api-access-rnqhj" (OuterVolumeSpecName: "kube-api-access-rnqhj") pod "718a3f24-cae1-449e-b979-0058c19dbe4b" (UID: "718a3f24-cae1-449e-b979-0058c19dbe4b"). InnerVolumeSpecName "kube-api-access-rnqhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.349800 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-run-ovn\") pod \"c2602204-68c3-475a-b689-23ec20a95b9c\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.350105 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8d4p\" (UniqueName: \"kubernetes.io/projected/dcd69365-f194-4607-ba00-f17ed2acbdb9-kube-api-access-f8d4p\") pod \"dcd69365-f194-4607-ba00-f17ed2acbdb9\" (UID: \"dcd69365-f194-4607-ba00-f17ed2acbdb9\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.350232 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71086560-4754-4032-b819-35a007beb5fd-operator-scripts\") pod \"71086560-4754-4032-b819-35a007beb5fd\" (UID: \"71086560-4754-4032-b819-35a007beb5fd\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.350337 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8svcw\" (UniqueName: \"kubernetes.io/projected/c2602204-68c3-475a-b689-23ec20a95b9c-kube-api-access-8svcw\") pod \"c2602204-68c3-475a-b689-23ec20a95b9c\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " Dec 04 09:59:42 crc 
kubenswrapper[4776]: I1204 09:59:42.350425 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c2602204-68c3-475a-b689-23ec20a95b9c-additional-scripts\") pod \"c2602204-68c3-475a-b689-23ec20a95b9c\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.350564 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-run\") pod \"c2602204-68c3-475a-b689-23ec20a95b9c\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.350701 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqqgn\" (UniqueName: \"kubernetes.io/projected/71086560-4754-4032-b819-35a007beb5fd-kube-api-access-bqqgn\") pod \"71086560-4754-4032-b819-35a007beb5fd\" (UID: \"71086560-4754-4032-b819-35a007beb5fd\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.350800 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd69365-f194-4607-ba00-f17ed2acbdb9-operator-scripts\") pod \"dcd69365-f194-4607-ba00-f17ed2acbdb9\" (UID: \"dcd69365-f194-4607-ba00-f17ed2acbdb9\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.351014 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-log-ovn\") pod \"c2602204-68c3-475a-b689-23ec20a95b9c\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.351198 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5hcx\" (UniqueName: 
\"kubernetes.io/projected/6c7dcd00-07c0-4e29-b608-e43d0e09cfba-kube-api-access-x5hcx\") pod \"6c7dcd00-07c0-4e29-b608-e43d0e09cfba\" (UID: \"6c7dcd00-07c0-4e29-b608-e43d0e09cfba\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.351331 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2602204-68c3-475a-b689-23ec20a95b9c-scripts\") pod \"c2602204-68c3-475a-b689-23ec20a95b9c\" (UID: \"c2602204-68c3-475a-b689-23ec20a95b9c\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.351437 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c7dcd00-07c0-4e29-b608-e43d0e09cfba-operator-scripts\") pod \"6c7dcd00-07c0-4e29-b608-e43d0e09cfba\" (UID: \"6c7dcd00-07c0-4e29-b608-e43d0e09cfba\") " Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.351942 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f4935bc-a17c-4ded-b454-21eb494550e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.352025 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq5rv\" (UniqueName: \"kubernetes.io/projected/7f4935bc-a17c-4ded-b454-21eb494550e5-kube-api-access-fq5rv\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.352112 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/718a3f24-cae1-449e-b979-0058c19dbe4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.352191 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnqhj\" (UniqueName: \"kubernetes.io/projected/718a3f24-cae1-449e-b979-0058c19dbe4b-kube-api-access-rnqhj\") on node \"crc\" DevicePath \"\"" Dec 04 
09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.355189 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c7dcd00-07c0-4e29-b608-e43d0e09cfba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c7dcd00-07c0-4e29-b608-e43d0e09cfba" (UID: "6c7dcd00-07c0-4e29-b608-e43d0e09cfba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.355366 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c2602204-68c3-475a-b689-23ec20a95b9c" (UID: "c2602204-68c3-475a-b689-23ec20a95b9c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.359349 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd69365-f194-4607-ba00-f17ed2acbdb9-kube-api-access-f8d4p" (OuterVolumeSpecName: "kube-api-access-f8d4p") pod "dcd69365-f194-4607-ba00-f17ed2acbdb9" (UID: "dcd69365-f194-4607-ba00-f17ed2acbdb9"). InnerVolumeSpecName "kube-api-access-f8d4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.360339 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71086560-4754-4032-b819-35a007beb5fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71086560-4754-4032-b819-35a007beb5fd" (UID: "71086560-4754-4032-b819-35a007beb5fd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.365101 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd69365-f194-4607-ba00-f17ed2acbdb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dcd69365-f194-4607-ba00-f17ed2acbdb9" (UID: "dcd69365-f194-4607-ba00-f17ed2acbdb9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.366318 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2602204-68c3-475a-b689-23ec20a95b9c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c2602204-68c3-475a-b689-23ec20a95b9c" (UID: "c2602204-68c3-475a-b689-23ec20a95b9c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.366469 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-run" (OuterVolumeSpecName: "var-run") pod "c2602204-68c3-475a-b689-23ec20a95b9c" (UID: "c2602204-68c3-475a-b689-23ec20a95b9c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.373038 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c2602204-68c3-475a-b689-23ec20a95b9c" (UID: "c2602204-68c3-475a-b689-23ec20a95b9c"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.373844 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2602204-68c3-475a-b689-23ec20a95b9c-scripts" (OuterVolumeSpecName: "scripts") pod "c2602204-68c3-475a-b689-23ec20a95b9c" (UID: "c2602204-68c3-475a-b689-23ec20a95b9c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.383599 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2602204-68c3-475a-b689-23ec20a95b9c-kube-api-access-8svcw" (OuterVolumeSpecName: "kube-api-access-8svcw") pod "c2602204-68c3-475a-b689-23ec20a95b9c" (UID: "c2602204-68c3-475a-b689-23ec20a95b9c"). InnerVolumeSpecName "kube-api-access-8svcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.384011 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71086560-4754-4032-b819-35a007beb5fd-kube-api-access-bqqgn" (OuterVolumeSpecName: "kube-api-access-bqqgn") pod "71086560-4754-4032-b819-35a007beb5fd" (UID: "71086560-4754-4032-b819-35a007beb5fd"). InnerVolumeSpecName "kube-api-access-bqqgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.385557 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c7dcd00-07c0-4e29-b608-e43d0e09cfba-kube-api-access-x5hcx" (OuterVolumeSpecName: "kube-api-access-x5hcx") pod "6c7dcd00-07c0-4e29-b608-e43d0e09cfba" (UID: "6c7dcd00-07c0-4e29-b608-e43d0e09cfba"). InnerVolumeSpecName "kube-api-access-x5hcx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.453539 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8d4p\" (UniqueName: \"kubernetes.io/projected/dcd69365-f194-4607-ba00-f17ed2acbdb9-kube-api-access-f8d4p\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.453575 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71086560-4754-4032-b819-35a007beb5fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.453589 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8svcw\" (UniqueName: \"kubernetes.io/projected/c2602204-68c3-475a-b689-23ec20a95b9c-kube-api-access-8svcw\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.453600 4776 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c2602204-68c3-475a-b689-23ec20a95b9c-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.453611 4776 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.453623 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqqgn\" (UniqueName: \"kubernetes.io/projected/71086560-4754-4032-b819-35a007beb5fd-kube-api-access-bqqgn\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.453633 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd69365-f194-4607-ba00-f17ed2acbdb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc 
kubenswrapper[4776]: I1204 09:59:42.453644 4776 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.453666 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5hcx\" (UniqueName: \"kubernetes.io/projected/6c7dcd00-07c0-4e29-b608-e43d0e09cfba-kube-api-access-x5hcx\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.453676 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2602204-68c3-475a-b689-23ec20a95b9c-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.453686 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c7dcd00-07c0-4e29-b608-e43d0e09cfba-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.453696 4776 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2602204-68c3-475a-b689-23ec20a95b9c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.940583 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-trxt7" event={"ID":"dcd69365-f194-4607-ba00-f17ed2acbdb9","Type":"ContainerDied","Data":"f640896ba92c796bf5a1421a589dd69bc7bdb33743b249a5f7c59426c896528b"} Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.940805 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f640896ba92c796bf5a1421a589dd69bc7bdb33743b249a5f7c59426c896528b" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.940620 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-trxt7" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.943945 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c65e-account-create-update-8tkss" event={"ID":"6c7dcd00-07c0-4e29-b608-e43d0e09cfba","Type":"ContainerDied","Data":"3dbf68a8d0845943469a2fa0292fca2782fbe2e65b0e457b50abbe4db6ec38bc"} Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.944088 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dbf68a8d0845943469a2fa0292fca2782fbe2e65b0e457b50abbe4db6ec38bc" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.944231 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c65e-account-create-update-8tkss" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.951363 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2lscm" event={"ID":"71086560-4754-4032-b819-35a007beb5fd","Type":"ContainerDied","Data":"afe64fc1b2cc1a75bf3006e6ed5a2b2358bc666f7cad6e08d57ee425d3b5c8fc"} Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.951662 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afe64fc1b2cc1a75bf3006e6ed5a2b2358bc666f7cad6e08d57ee425d3b5c8fc" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.951746 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2lscm" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.956008 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tchdq-config-77gg2" event={"ID":"c2602204-68c3-475a-b689-23ec20a95b9c","Type":"ContainerDied","Data":"7d1bf13d40df21d37c87e8fe105296b28e48a85097960eb9db1c3de7fb53bf49"} Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.956056 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d1bf13d40df21d37c87e8fe105296b28e48a85097960eb9db1c3de7fb53bf49" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.956133 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tchdq-config-77gg2" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.963365 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a78-account-create-update-77b4l" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.964957 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p2lqk" event={"ID":"eb9bffe6-0182-47a3-b5f6-86297c6f5c92","Type":"ContainerStarted","Data":"5729bc78d7877fa1b23598e87af657f7d2b962fb77203d5b729c4544718c5afb"} Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.965104 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-245b-account-create-update-pckpb" Dec 04 09:59:42 crc kubenswrapper[4776]: I1204 09:59:42.988080 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-p2lqk" podStartSLOduration=6.218289673 podStartE2EDuration="11.988059654s" podCreationTimestamp="2025-12-04 09:59:31 +0000 UTC" firstStartedPulling="2025-12-04 09:59:36.203944217 +0000 UTC m=+1221.070424594" lastFinishedPulling="2025-12-04 09:59:41.973714198 +0000 UTC m=+1226.840194575" observedRunningTime="2025-12-04 09:59:42.980519698 +0000 UTC m=+1227.847000085" watchObservedRunningTime="2025-12-04 09:59:42.988059654 +0000 UTC m=+1227.854540031" Dec 04 09:59:43 crc kubenswrapper[4776]: I1204 09:59:43.412524 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-tchdq-config-77gg2"] Dec 04 09:59:43 crc kubenswrapper[4776]: I1204 09:59:43.420158 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-tchdq-config-77gg2"] Dec 04 09:59:43 crc kubenswrapper[4776]: I1204 09:59:43.463509 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2602204-68c3-475a-b689-23ec20a95b9c" path="/var/lib/kubelet/pods/c2602204-68c3-475a-b689-23ec20a95b9c/volumes" Dec 04 09:59:45 crc kubenswrapper[4776]: I1204 09:59:45.987792 4776 generic.go:334] "Generic (PLEG): container finished" podID="c93c4643-8ac8-4063-b35f-b6de695f42dd" containerID="bf14193fa054783ce050dbba93b5a5a233448023726cfeecd16c960f32165542" exitCode=0 Dec 04 09:59:45 crc kubenswrapper[4776]: I1204 09:59:45.987875 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6vlkr" event={"ID":"c93c4643-8ac8-4063-b35f-b6de695f42dd","Type":"ContainerDied","Data":"bf14193fa054783ce050dbba93b5a5a233448023726cfeecd16c960f32165542"} Dec 04 09:59:45 crc kubenswrapper[4776]: I1204 09:59:45.990753 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="eb9bffe6-0182-47a3-b5f6-86297c6f5c92" containerID="5729bc78d7877fa1b23598e87af657f7d2b962fb77203d5b729c4544718c5afb" exitCode=0 Dec 04 09:59:45 crc kubenswrapper[4776]: I1204 09:59:45.990820 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p2lqk" event={"ID":"eb9bffe6-0182-47a3-b5f6-86297c6f5c92","Type":"ContainerDied","Data":"5729bc78d7877fa1b23598e87af657f7d2b962fb77203d5b729c4544718c5afb"} Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.452513 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.477811 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.542716 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px26c\" (UniqueName: \"kubernetes.io/projected/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-kube-api-access-px26c\") pod \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\" (UID: \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\") " Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.542795 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-config-data\") pod \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\" (UID: \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\") " Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.542907 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-combined-ca-bundle\") pod \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\" (UID: \"eb9bffe6-0182-47a3-b5f6-86297c6f5c92\") " Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.553283 4776 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-kube-api-access-px26c" (OuterVolumeSpecName: "kube-api-access-px26c") pod "eb9bffe6-0182-47a3-b5f6-86297c6f5c92" (UID: "eb9bffe6-0182-47a3-b5f6-86297c6f5c92"). InnerVolumeSpecName "kube-api-access-px26c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.575803 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb9bffe6-0182-47a3-b5f6-86297c6f5c92" (UID: "eb9bffe6-0182-47a3-b5f6-86297c6f5c92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.595847 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-config-data" (OuterVolumeSpecName: "config-data") pod "eb9bffe6-0182-47a3-b5f6-86297c6f5c92" (UID: "eb9bffe6-0182-47a3-b5f6-86297c6f5c92"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.645801 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-db-sync-config-data\") pod \"c93c4643-8ac8-4063-b35f-b6de695f42dd\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.645876 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-config-data\") pod \"c93c4643-8ac8-4063-b35f-b6de695f42dd\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.645944 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wdqr\" (UniqueName: \"kubernetes.io/projected/c93c4643-8ac8-4063-b35f-b6de695f42dd-kube-api-access-2wdqr\") pod \"c93c4643-8ac8-4063-b35f-b6de695f42dd\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.646035 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-combined-ca-bundle\") pod \"c93c4643-8ac8-4063-b35f-b6de695f42dd\" (UID: \"c93c4643-8ac8-4063-b35f-b6de695f42dd\") " Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.646566 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px26c\" (UniqueName: \"kubernetes.io/projected/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-kube-api-access-px26c\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.646596 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-config-data\") on 
node \"crc\" DevicePath \"\"" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.646611 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb9bffe6-0182-47a3-b5f6-86297c6f5c92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.649627 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c93c4643-8ac8-4063-b35f-b6de695f42dd" (UID: "c93c4643-8ac8-4063-b35f-b6de695f42dd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.649817 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c93c4643-8ac8-4063-b35f-b6de695f42dd-kube-api-access-2wdqr" (OuterVolumeSpecName: "kube-api-access-2wdqr") pod "c93c4643-8ac8-4063-b35f-b6de695f42dd" (UID: "c93c4643-8ac8-4063-b35f-b6de695f42dd"). InnerVolumeSpecName "kube-api-access-2wdqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.676695 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c93c4643-8ac8-4063-b35f-b6de695f42dd" (UID: "c93c4643-8ac8-4063-b35f-b6de695f42dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.693251 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-config-data" (OuterVolumeSpecName: "config-data") pod "c93c4643-8ac8-4063-b35f-b6de695f42dd" (UID: "c93c4643-8ac8-4063-b35f-b6de695f42dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.748860 4776 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.748903 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.748935 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wdqr\" (UniqueName: \"kubernetes.io/projected/c93c4643-8ac8-4063-b35f-b6de695f42dd-kube-api-access-2wdqr\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:47 crc kubenswrapper[4776]: I1204 09:59:47.748951 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93c4643-8ac8-4063-b35f-b6de695f42dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.007474 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p2lqk" event={"ID":"eb9bffe6-0182-47a3-b5f6-86297c6f5c92","Type":"ContainerDied","Data":"2ae36e58d0b41ac3cb8cc5cc45fd7724ff450b727a213acd8371c5967ac83c66"} Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.007565 4776 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="2ae36e58d0b41ac3cb8cc5cc45fd7724ff450b727a213acd8371c5967ac83c66" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.007517 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p2lqk" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.008822 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6vlkr" event={"ID":"c93c4643-8ac8-4063-b35f-b6de695f42dd","Type":"ContainerDied","Data":"b99d806d4624d55f640848821831217e0e40abb813f69d1454098e7d0952c2ea"} Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.009076 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b99d806d4624d55f640848821831217e0e40abb813f69d1454098e7d0952c2ea" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.008887 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6vlkr" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.354395 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-62kn7"] Dec 04 09:59:48 crc kubenswrapper[4776]: E1204 09:59:48.354788 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c7dcd00-07c0-4e29-b608-e43d0e09cfba" containerName="mariadb-account-create-update" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.354804 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7dcd00-07c0-4e29-b608-e43d0e09cfba" containerName="mariadb-account-create-update" Dec 04 09:59:48 crc kubenswrapper[4776]: E1204 09:59:48.354817 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718a3f24-cae1-449e-b979-0058c19dbe4b" containerName="mariadb-account-create-update" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.354826 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="718a3f24-cae1-449e-b979-0058c19dbe4b" containerName="mariadb-account-create-update" Dec 04 
09:59:48 crc kubenswrapper[4776]: E1204 09:59:48.354848 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93c4643-8ac8-4063-b35f-b6de695f42dd" containerName="glance-db-sync" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.354855 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93c4643-8ac8-4063-b35f-b6de695f42dd" containerName="glance-db-sync" Dec 04 09:59:48 crc kubenswrapper[4776]: E1204 09:59:48.354872 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4935bc-a17c-4ded-b454-21eb494550e5" containerName="mariadb-account-create-update" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.354878 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4935bc-a17c-4ded-b454-21eb494550e5" containerName="mariadb-account-create-update" Dec 04 09:59:48 crc kubenswrapper[4776]: E1204 09:59:48.354888 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7507bda5-9608-4cd8-b40b-f9e69a06d41c" containerName="mariadb-database-create" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.354894 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7507bda5-9608-4cd8-b40b-f9e69a06d41c" containerName="mariadb-database-create" Dec 04 09:59:48 crc kubenswrapper[4776]: E1204 09:59:48.354905 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9bffe6-0182-47a3-b5f6-86297c6f5c92" containerName="keystone-db-sync" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.354929 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9bffe6-0182-47a3-b5f6-86297c6f5c92" containerName="keystone-db-sync" Dec 04 09:59:48 crc kubenswrapper[4776]: E1204 09:59:48.354938 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71086560-4754-4032-b819-35a007beb5fd" containerName="mariadb-database-create" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.354943 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="71086560-4754-4032-b819-35a007beb5fd" 
containerName="mariadb-database-create" Dec 04 09:59:48 crc kubenswrapper[4776]: E1204 09:59:48.354954 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd69365-f194-4607-ba00-f17ed2acbdb9" containerName="mariadb-database-create" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.354960 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd69365-f194-4607-ba00-f17ed2acbdb9" containerName="mariadb-database-create" Dec 04 09:59:48 crc kubenswrapper[4776]: E1204 09:59:48.354973 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2602204-68c3-475a-b689-23ec20a95b9c" containerName="ovn-config" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.354985 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2602204-68c3-475a-b689-23ec20a95b9c" containerName="ovn-config" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.355128 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="71086560-4754-4032-b819-35a007beb5fd" containerName="mariadb-database-create" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.355163 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="718a3f24-cae1-449e-b979-0058c19dbe4b" containerName="mariadb-account-create-update" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.355177 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7507bda5-9608-4cd8-b40b-f9e69a06d41c" containerName="mariadb-database-create" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.355189 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c93c4643-8ac8-4063-b35f-b6de695f42dd" containerName="glance-db-sync" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.355204 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd69365-f194-4607-ba00-f17ed2acbdb9" containerName="mariadb-database-create" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.355214 4776 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6c7dcd00-07c0-4e29-b608-e43d0e09cfba" containerName="mariadb-account-create-update" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.355224 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb9bffe6-0182-47a3-b5f6-86297c6f5c92" containerName="keystone-db-sync" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.355239 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f4935bc-a17c-4ded-b454-21eb494550e5" containerName="mariadb-account-create-update" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.355250 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2602204-68c3-475a-b689-23ec20a95b9c" containerName="ovn-config" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.355841 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.369345 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-fqt2z"] Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.370328 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.371241 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.371405 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.371541 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.371870 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cwwkl" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.376979 4776 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.391762 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-62kn7"] Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.427976 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-fqt2z"] Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.459340 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-fernet-keys\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.460665 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.460829 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-credential-keys\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.460975 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmbjt\" (UniqueName: \"kubernetes.io/projected/151ce16e-85c7-4748-82b4-5ee53c171e75-kube-api-access-hmbjt\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " 
pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.461084 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.461187 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf29z\" (UniqueName: \"kubernetes.io/projected/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-kube-api-access-tf29z\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.461327 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-scripts\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.461557 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-config\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.461687 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " 
pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.461954 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-combined-ca-bundle\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.464204 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-config-data\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.565461 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-combined-ca-bundle\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.565784 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-config-data\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.565938 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-fernet-keys\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc 
kubenswrapper[4776]: I1204 09:59:48.566042 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.566158 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-credential-keys\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.566284 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmbjt\" (UniqueName: \"kubernetes.io/projected/151ce16e-85c7-4748-82b4-5ee53c171e75-kube-api-access-hmbjt\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.566365 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.566468 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf29z\" (UniqueName: \"kubernetes.io/projected/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-kube-api-access-tf29z\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.566606 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-scripts\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.566784 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-config\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.566941 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.569611 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.570567 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.570805 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-config\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.571437 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.575767 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-4bt5q"] Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.576148 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-config-data\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.577061 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-combined-ca-bundle\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.577679 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.579293 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-scripts\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.579752 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-fernet-keys\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.582339 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xdgz8" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.582630 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.585621 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-credential-keys\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.586250 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.591371 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-fqt2z"] Dec 04 09:59:48 crc kubenswrapper[4776]: E1204 09:59:48.592204 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-hmbjt], unattached 
volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" podUID="151ce16e-85c7-4748-82b4-5ee53c171e75" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.607941 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmbjt\" (UniqueName: \"kubernetes.io/projected/151ce16e-85c7-4748-82b4-5ee53c171e75-kube-api-access-hmbjt\") pod \"dnsmasq-dns-75bb4695fc-fqt2z\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.621824 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4bt5q"] Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.650418 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf29z\" (UniqueName: \"kubernetes.io/projected/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-kube-api-access-tf29z\") pod \"keystone-bootstrap-62kn7\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.670070 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-db-sync-config-data\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.670127 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-combined-ca-bundle\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.670173 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8765b919-d724-4148-8ba8-a550cd8029fc-etc-machine-id\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.670222 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp6d9\" (UniqueName: \"kubernetes.io/projected/8765b919-d724-4148-8ba8-a550cd8029fc-kube-api-access-fp6d9\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.670264 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-config-data\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.670342 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-scripts\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.708236 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-62kn7" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.720519 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-k5qrd"] Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.728063 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.777929 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-config-data\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.778290 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-scripts\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.778339 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-db-sync-config-data\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.778382 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-combined-ca-bundle\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.778443 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8765b919-d724-4148-8ba8-a550cd8029fc-etc-machine-id\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.778476 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp6d9\" (UniqueName: \"kubernetes.io/projected/8765b919-d724-4148-8ba8-a550cd8029fc-kube-api-access-fp6d9\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.787309 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-db-sync-config-data\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.788101 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8765b919-d724-4148-8ba8-a550cd8029fc-etc-machine-id\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.792211 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-scripts\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.797844 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-combined-ca-bundle\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.802826 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-config-data\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.828049 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-k5qrd"] Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.855028 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp6d9\" (UniqueName: \"kubernetes.io/projected/8765b919-d724-4148-8ba8-a550cd8029fc-kube-api-access-fp6d9\") pod \"cinder-db-sync-4bt5q\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.871983 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.874154 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.882421 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-config\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.882532 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.882577 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-hv2cn\" (UniqueName: \"kubernetes.io/projected/ff2d164b-a423-4431-a0e8-02554236f17f-kube-api-access-hv2cn\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.882667 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-dns-svc\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.882711 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.904550 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.904841 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.908895 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.941738 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-76hnz"] Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.943127 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-76hnz" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.946449 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.946648 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-jk975" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.947159 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.973429 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-76hnz"] Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.989341 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-config\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.989445 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bb005b-010a-491e-b788-4ceb11d4c510-log-httpd\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.989498 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-scripts\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.989524 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.989571 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-config-data\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.989604 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv2cn\" (UniqueName: \"kubernetes.io/projected/ff2d164b-a423-4431-a0e8-02554236f17f-kube-api-access-hv2cn\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.989729 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-dns-svc\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.989759 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.989787 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9fwn\" (UniqueName: 
\"kubernetes.io/projected/33bb005b-010a-491e-b788-4ceb11d4c510-kube-api-access-p9fwn\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.989838 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.989886 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.989928 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bb005b-010a-491e-b788-4ceb11d4c510-run-httpd\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.992390 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.993091 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-dns-svc\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: 
\"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:48 crc kubenswrapper[4776]: I1204 09:59:48.993644 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.000368 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-config\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.056353 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4bt5q" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.061718 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.062583 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv2cn\" (UniqueName: \"kubernetes.io/projected/ff2d164b-a423-4431-a0e8-02554236f17f-kube-api-access-hv2cn\") pod \"dnsmasq-dns-6546db6db7-k5qrd\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.110131 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.110765 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bb005b-010a-491e-b788-4ceb11d4c510-run-httpd\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.111004 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c107c323-1d20-4a32-82be-2085097e6d5d-combined-ca-bundle\") pod \"neutron-db-sync-76hnz\" (UID: \"c107c323-1d20-4a32-82be-2085097e6d5d\") " pod="openstack/neutron-db-sync-76hnz" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.111338 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bb005b-010a-491e-b788-4ceb11d4c510-log-httpd\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.111476 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-scripts\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.111605 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-config-data\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.111794 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td8ws\" (UniqueName: \"kubernetes.io/projected/c107c323-1d20-4a32-82be-2085097e6d5d-kube-api-access-td8ws\") pod \"neutron-db-sync-76hnz\" (UID: \"c107c323-1d20-4a32-82be-2085097e6d5d\") " pod="openstack/neutron-db-sync-76hnz" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.112745 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.112942 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9fwn\" (UniqueName: \"kubernetes.io/projected/33bb005b-010a-491e-b788-4ceb11d4c510-kube-api-access-p9fwn\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.127721 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.132646 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c107c323-1d20-4a32-82be-2085097e6d5d-config\") pod \"neutron-db-sync-76hnz\" (UID: \"c107c323-1d20-4a32-82be-2085097e6d5d\") " pod="openstack/neutron-db-sync-76hnz" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.139031 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wqhr2"] Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.141783 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wqhr2" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.145595 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hjjs6" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.145969 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.146704 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bb005b-010a-491e-b788-4ceb11d4c510-run-httpd\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.147214 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.155483 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bb005b-010a-491e-b788-4ceb11d4c510-log-httpd\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.163044 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.202546 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-config-data\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.203645 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-scripts\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.234059 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wqhr2"] Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.237053 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-config\") pod \"151ce16e-85c7-4748-82b4-5ee53c171e75\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.237156 4776 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-dns-svc\") pod \"151ce16e-85c7-4748-82b4-5ee53c171e75\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.237317 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmbjt\" (UniqueName: \"kubernetes.io/projected/151ce16e-85c7-4748-82b4-5ee53c171e75-kube-api-access-hmbjt\") pod \"151ce16e-85c7-4748-82b4-5ee53c171e75\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.237355 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-ovsdbserver-nb\") pod \"151ce16e-85c7-4748-82b4-5ee53c171e75\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.237404 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-ovsdbserver-sb\") pod \"151ce16e-85c7-4748-82b4-5ee53c171e75\" (UID: \"151ce16e-85c7-4748-82b4-5ee53c171e75\") " Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.239234 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "151ce16e-85c7-4748-82b4-5ee53c171e75" (UID: "151ce16e-85c7-4748-82b4-5ee53c171e75"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.239270 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.239824 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-config" (OuterVolumeSpecName: "config") pod "151ce16e-85c7-4748-82b4-5ee53c171e75" (UID: "151ce16e-85c7-4748-82b4-5ee53c171e75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.243469 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "151ce16e-85c7-4748-82b4-5ee53c171e75" (UID: "151ce16e-85c7-4748-82b4-5ee53c171e75"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.245008 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "151ce16e-85c7-4748-82b4-5ee53c171e75" (UID: "151ce16e-85c7-4748-82b4-5ee53c171e75"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.247806 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td8ws\" (UniqueName: \"kubernetes.io/projected/c107c323-1d20-4a32-82be-2085097e6d5d-kube-api-access-td8ws\") pod \"neutron-db-sync-76hnz\" (UID: \"c107c323-1d20-4a32-82be-2085097e6d5d\") " pod="openstack/neutron-db-sync-76hnz" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.248731 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92fb916-9d62-42d7-bad8-12cd43af37e9-combined-ca-bundle\") pod \"barbican-db-sync-wqhr2\" (UID: \"e92fb916-9d62-42d7-bad8-12cd43af37e9\") " pod="openstack/barbican-db-sync-wqhr2" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.248984 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e92fb916-9d62-42d7-bad8-12cd43af37e9-db-sync-config-data\") pod \"barbican-db-sync-wqhr2\" (UID: \"e92fb916-9d62-42d7-bad8-12cd43af37e9\") " pod="openstack/barbican-db-sync-wqhr2" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.249171 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c107c323-1d20-4a32-82be-2085097e6d5d-config\") pod \"neutron-db-sync-76hnz\" (UID: \"c107c323-1d20-4a32-82be-2085097e6d5d\") " pod="openstack/neutron-db-sync-76hnz" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.257559 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c107c323-1d20-4a32-82be-2085097e6d5d-combined-ca-bundle\") pod \"neutron-db-sync-76hnz\" (UID: \"c107c323-1d20-4a32-82be-2085097e6d5d\") " pod="openstack/neutron-db-sync-76hnz" Dec 04 
09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.257937 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqrq2\" (UniqueName: \"kubernetes.io/projected/e92fb916-9d62-42d7-bad8-12cd43af37e9-kube-api-access-xqrq2\") pod \"barbican-db-sync-wqhr2\" (UID: \"e92fb916-9d62-42d7-bad8-12cd43af37e9\") " pod="openstack/barbican-db-sync-wqhr2" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.258485 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.258506 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.258523 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.258565 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/151ce16e-85c7-4748-82b4-5ee53c171e75-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.258772 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9fwn\" (UniqueName: \"kubernetes.io/projected/33bb005b-010a-491e-b788-4ceb11d4c510-kube-api-access-p9fwn\") pod \"ceilometer-0\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.262036 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-k5qrd"] Dec 04 09:59:49 crc 
kubenswrapper[4776]: I1204 09:59:49.274608 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.296424 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151ce16e-85c7-4748-82b4-5ee53c171e75-kube-api-access-hmbjt" (OuterVolumeSpecName: "kube-api-access-hmbjt") pod "151ce16e-85c7-4748-82b4-5ee53c171e75" (UID: "151ce16e-85c7-4748-82b4-5ee53c171e75"). InnerVolumeSpecName "kube-api-access-hmbjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.297023 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c107c323-1d20-4a32-82be-2085097e6d5d-config\") pod \"neutron-db-sync-76hnz\" (UID: \"c107c323-1d20-4a32-82be-2085097e6d5d\") " pod="openstack/neutron-db-sync-76hnz" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.301542 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c107c323-1d20-4a32-82be-2085097e6d5d-combined-ca-bundle\") pod \"neutron-db-sync-76hnz\" (UID: \"c107c323-1d20-4a32-82be-2085097e6d5d\") " pod="openstack/neutron-db-sync-76hnz" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.309586 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td8ws\" (UniqueName: \"kubernetes.io/projected/c107c323-1d20-4a32-82be-2085097e6d5d-kube-api-access-td8ws\") pod \"neutron-db-sync-76hnz\" (UID: \"c107c323-1d20-4a32-82be-2085097e6d5d\") " pod="openstack/neutron-db-sync-76hnz" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.315774 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-76hnz" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.325036 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-z2nzw"] Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.326452 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.329611 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.329834 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rd42l" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.331380 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.335573 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-z2nzw"] Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.349714 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-s2n66"] Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.353042 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.361977 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqrq2\" (UniqueName: \"kubernetes.io/projected/e92fb916-9d62-42d7-bad8-12cd43af37e9-kube-api-access-xqrq2\") pod \"barbican-db-sync-wqhr2\" (UID: \"e92fb916-9d62-42d7-bad8-12cd43af37e9\") " pod="openstack/barbican-db-sync-wqhr2" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.362044 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92fb916-9d62-42d7-bad8-12cd43af37e9-combined-ca-bundle\") pod \"barbican-db-sync-wqhr2\" (UID: \"e92fb916-9d62-42d7-bad8-12cd43af37e9\") " pod="openstack/barbican-db-sync-wqhr2" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.362084 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e92fb916-9d62-42d7-bad8-12cd43af37e9-db-sync-config-data\") pod \"barbican-db-sync-wqhr2\" (UID: \"e92fb916-9d62-42d7-bad8-12cd43af37e9\") " pod="openstack/barbican-db-sync-wqhr2" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.362170 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmbjt\" (UniqueName: \"kubernetes.io/projected/151ce16e-85c7-4748-82b4-5ee53c171e75-kube-api-access-hmbjt\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.367572 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e92fb916-9d62-42d7-bad8-12cd43af37e9-db-sync-config-data\") pod \"barbican-db-sync-wqhr2\" (UID: \"e92fb916-9d62-42d7-bad8-12cd43af37e9\") " pod="openstack/barbican-db-sync-wqhr2" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.383286 4776 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-s2n66"] Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.396212 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqrq2\" (UniqueName: \"kubernetes.io/projected/e92fb916-9d62-42d7-bad8-12cd43af37e9-kube-api-access-xqrq2\") pod \"barbican-db-sync-wqhr2\" (UID: \"e92fb916-9d62-42d7-bad8-12cd43af37e9\") " pod="openstack/barbican-db-sync-wqhr2" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.406562 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92fb916-9d62-42d7-bad8-12cd43af37e9-combined-ca-bundle\") pod \"barbican-db-sync-wqhr2\" (UID: \"e92fb916-9d62-42d7-bad8-12cd43af37e9\") " pod="openstack/barbican-db-sync-wqhr2" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.465754 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-scripts\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.465818 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb1453cb-374e-4b8d-8f13-af4be7baa997-logs\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.465849 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-combined-ca-bundle\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc 
kubenswrapper[4776]: I1204 09:59:49.465871 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-config-data\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.465943 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfjvc\" (UniqueName: \"kubernetes.io/projected/db382a72-b559-43b9-ab00-b843f38661a4-kube-api-access-qfjvc\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.466011 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.466035 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.466092 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-config\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 
09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.466121 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.466168 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7446\" (UniqueName: \"kubernetes.io/projected/fb1453cb-374e-4b8d-8f13-af4be7baa997-kube-api-access-g7446\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.521880 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wqhr2" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.568304 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-config\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.570041 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.570120 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7446\" (UniqueName: 
\"kubernetes.io/projected/fb1453cb-374e-4b8d-8f13-af4be7baa997-kube-api-access-g7446\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.570159 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-scripts\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.570188 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb1453cb-374e-4b8d-8f13-af4be7baa997-logs\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.570219 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-combined-ca-bundle\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.570241 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-config-data\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.570297 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfjvc\" (UniqueName: \"kubernetes.io/projected/db382a72-b559-43b9-ab00-b843f38661a4-kube-api-access-qfjvc\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: 
\"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.570361 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.570394 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.571445 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.573595 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.574310 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-config\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc 
kubenswrapper[4776]: I1204 09:59:49.575647 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb1453cb-374e-4b8d-8f13-af4be7baa997-logs\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.584635 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.585365 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-scripts\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.588966 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-config-data\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.589709 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-combined-ca-bundle\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.593011 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7446\" (UniqueName: 
\"kubernetes.io/projected/fb1453cb-374e-4b8d-8f13-af4be7baa997-kube-api-access-g7446\") pod \"placement-db-sync-z2nzw\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.593394 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfjvc\" (UniqueName: \"kubernetes.io/projected/db382a72-b559-43b9-ab00-b843f38661a4-kube-api-access-qfjvc\") pod \"dnsmasq-dns-7987f74bbc-s2n66\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.645693 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.667258 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-62kn7"] Dec 04 09:59:49 crc kubenswrapper[4776]: W1204 09:59:49.695612 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4bcbd97_4710_4df7_9fce_6fb80c0e6d7a.slice/crio-9f121d6995697685425854433f6fa47612f9697f331c7f1aa98c35e57eebfad9 WatchSource:0}: Error finding container 9f121d6995697685425854433f6fa47612f9697f331c7f1aa98c35e57eebfad9: Status 404 returned error can't find the container with id 9f121d6995697685425854433f6fa47612f9697f331c7f1aa98c35e57eebfad9 Dec 04 09:59:49 crc kubenswrapper[4776]: I1204 09:59:49.855586 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-z2nzw" Dec 04 09:59:50 crc kubenswrapper[4776]: I1204 09:59:50.079125 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-fqt2z" Dec 04 09:59:50 crc kubenswrapper[4776]: I1204 09:59:50.079672 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-62kn7" event={"ID":"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a","Type":"ContainerStarted","Data":"9f121d6995697685425854433f6fa47612f9697f331c7f1aa98c35e57eebfad9"} Dec 04 09:59:50 crc kubenswrapper[4776]: I1204 09:59:50.106031 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-76hnz"] Dec 04 09:59:50 crc kubenswrapper[4776]: W1204 09:59:50.116248 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc107c323_1d20_4a32_82be_2085097e6d5d.slice/crio-1c81e46e75de757dd984cd8fc96a7a859464e43b1cf3d4eae5e4fd9b540bf06a WatchSource:0}: Error finding container 1c81e46e75de757dd984cd8fc96a7a859464e43b1cf3d4eae5e4fd9b540bf06a: Status 404 returned error can't find the container with id 1c81e46e75de757dd984cd8fc96a7a859464e43b1cf3d4eae5e4fd9b540bf06a Dec 04 09:59:50 crc kubenswrapper[4776]: I1204 09:59:50.143682 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4bt5q"] Dec 04 09:59:50 crc kubenswrapper[4776]: I1204 09:59:50.173432 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 09:59:50 crc kubenswrapper[4776]: I1204 09:59:50.185297 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-k5qrd"] Dec 04 09:59:50 crc kubenswrapper[4776]: I1204 09:59:50.232588 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-fqt2z"] Dec 04 09:59:50 crc kubenswrapper[4776]: I1204 09:59:50.259354 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-fqt2z"] Dec 04 09:59:50 crc kubenswrapper[4776]: I1204 09:59:50.355373 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-7987f74bbc-s2n66"] Dec 04 09:59:50 crc kubenswrapper[4776]: I1204 09:59:50.368166 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wqhr2"] Dec 04 09:59:50 crc kubenswrapper[4776]: W1204 09:59:50.372830 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb382a72_b559_43b9_ab00_b843f38661a4.slice/crio-4eb00471b9bb2c7af9afe3bc9a623409bf4f7df2449058776632d31435985b79 WatchSource:0}: Error finding container 4eb00471b9bb2c7af9afe3bc9a623409bf4f7df2449058776632d31435985b79: Status 404 returned error can't find the container with id 4eb00471b9bb2c7af9afe3bc9a623409bf4f7df2449058776632d31435985b79 Dec 04 09:59:50 crc kubenswrapper[4776]: I1204 09:59:50.510434 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-z2nzw"] Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.110986 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z2nzw" event={"ID":"fb1453cb-374e-4b8d-8f13-af4be7baa997","Type":"ContainerStarted","Data":"07a60280ac6b36b3d45e7b3aa005ebc9fbba33c10558f19c601cf6f9f36c158c"} Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.115591 4776 generic.go:334] "Generic (PLEG): container finished" podID="ff2d164b-a423-4431-a0e8-02554236f17f" containerID="2eefb2df322e1ea90b852532a895947c7d395d3031ee113dcb05e40276cc7217" exitCode=0 Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.115692 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" event={"ID":"ff2d164b-a423-4431-a0e8-02554236f17f","Type":"ContainerDied","Data":"2eefb2df322e1ea90b852532a895947c7d395d3031ee113dcb05e40276cc7217"} Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.116340 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" 
event={"ID":"ff2d164b-a423-4431-a0e8-02554236f17f","Type":"ContainerStarted","Data":"7be307d67b34e71641cb96c36651174ca63d4f98dc755fb83385c918ae1bc81f"} Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.139688 4776 generic.go:334] "Generic (PLEG): container finished" podID="db382a72-b559-43b9-ab00-b843f38661a4" containerID="cf7e2f6aa22121179ba4435262fa30c4b1be8acd6bcf92582a9aa0c4dc911055" exitCode=0 Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.139794 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" event={"ID":"db382a72-b559-43b9-ab00-b843f38661a4","Type":"ContainerDied","Data":"cf7e2f6aa22121179ba4435262fa30c4b1be8acd6bcf92582a9aa0c4dc911055"} Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.139823 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" event={"ID":"db382a72-b559-43b9-ab00-b843f38661a4","Type":"ContainerStarted","Data":"4eb00471b9bb2c7af9afe3bc9a623409bf4f7df2449058776632d31435985b79"} Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.145254 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bb005b-010a-491e-b788-4ceb11d4c510","Type":"ContainerStarted","Data":"189483f063d3bb7d5d8d53c7dde372569b221abe145ac2c2aab17f534c9c67c1"} Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.155336 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-62kn7" event={"ID":"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a","Type":"ContainerStarted","Data":"573e315a75a0b96a36b02683cea5517fd4355e5598a9d565f98a1953a929280b"} Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.166133 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wqhr2" event={"ID":"e92fb916-9d62-42d7-bad8-12cd43af37e9","Type":"ContainerStarted","Data":"c07c6df9b45493430b3ccd52a0a42bd65a3ebe0e5097f03277079d89c126469b"} Dec 04 09:59:51 crc kubenswrapper[4776]: 
I1204 09:59:51.173199 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-76hnz" event={"ID":"c107c323-1d20-4a32-82be-2085097e6d5d","Type":"ContainerStarted","Data":"e916ce4c0ea710c0a4b38826f12fd7b949e75724ba7e9a41cc90391ee9e3198e"} Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.173245 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-76hnz" event={"ID":"c107c323-1d20-4a32-82be-2085097e6d5d","Type":"ContainerStarted","Data":"1c81e46e75de757dd984cd8fc96a7a859464e43b1cf3d4eae5e4fd9b540bf06a"} Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.177184 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4bt5q" event={"ID":"8765b919-d724-4148-8ba8-a550cd8029fc","Type":"ContainerStarted","Data":"35da4911858a3384dbebfe6dba44c45fa85a55230c97ce977587cba5ca3a27d3"} Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.195611 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-62kn7" podStartSLOduration=3.195580225 podStartE2EDuration="3.195580225s" podCreationTimestamp="2025-12-04 09:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:59:51.194428379 +0000 UTC m=+1236.060908756" watchObservedRunningTime="2025-12-04 09:59:51.195580225 +0000 UTC m=+1236.062060602" Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.255907 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-76hnz" podStartSLOduration=3.255877624 podStartE2EDuration="3.255877624s" podCreationTimestamp="2025-12-04 09:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:59:51.212557067 +0000 UTC m=+1236.079037464" watchObservedRunningTime="2025-12-04 09:59:51.255877624 +0000 UTC 
m=+1236.122358001" Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.488052 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="151ce16e-85c7-4748-82b4-5ee53c171e75" path="/var/lib/kubelet/pods/151ce16e-85c7-4748-82b4-5ee53c171e75/volumes" Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.671991 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.763428 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.842664 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-dns-svc\") pod \"ff2d164b-a423-4431-a0e8-02554236f17f\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.842756 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-ovsdbserver-sb\") pod \"ff2d164b-a423-4431-a0e8-02554236f17f\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.842792 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-ovsdbserver-nb\") pod \"ff2d164b-a423-4431-a0e8-02554236f17f\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.842851 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-config\") pod \"ff2d164b-a423-4431-a0e8-02554236f17f\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " Dec 04 
09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.842943 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv2cn\" (UniqueName: \"kubernetes.io/projected/ff2d164b-a423-4431-a0e8-02554236f17f-kube-api-access-hv2cn\") pod \"ff2d164b-a423-4431-a0e8-02554236f17f\" (UID: \"ff2d164b-a423-4431-a0e8-02554236f17f\") " Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.853212 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff2d164b-a423-4431-a0e8-02554236f17f-kube-api-access-hv2cn" (OuterVolumeSpecName: "kube-api-access-hv2cn") pod "ff2d164b-a423-4431-a0e8-02554236f17f" (UID: "ff2d164b-a423-4431-a0e8-02554236f17f"). InnerVolumeSpecName "kube-api-access-hv2cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.873765 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff2d164b-a423-4431-a0e8-02554236f17f" (UID: "ff2d164b-a423-4431-a0e8-02554236f17f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.883727 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff2d164b-a423-4431-a0e8-02554236f17f" (UID: "ff2d164b-a423-4431-a0e8-02554236f17f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.884287 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-config" (OuterVolumeSpecName: "config") pod "ff2d164b-a423-4431-a0e8-02554236f17f" (UID: "ff2d164b-a423-4431-a0e8-02554236f17f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.894215 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff2d164b-a423-4431-a0e8-02554236f17f" (UID: "ff2d164b-a423-4431-a0e8-02554236f17f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.946311 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv2cn\" (UniqueName: \"kubernetes.io/projected/ff2d164b-a423-4431-a0e8-02554236f17f-kube-api-access-hv2cn\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.946350 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.946360 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:51 crc kubenswrapper[4776]: I1204 09:59:51.946369 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:51 crc 
kubenswrapper[4776]: I1204 09:59:51.946378 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2d164b-a423-4431-a0e8-02554236f17f-config\") on node \"crc\" DevicePath \"\"" Dec 04 09:59:52 crc kubenswrapper[4776]: I1204 09:59:52.200806 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" event={"ID":"ff2d164b-a423-4431-a0e8-02554236f17f","Type":"ContainerDied","Data":"7be307d67b34e71641cb96c36651174ca63d4f98dc755fb83385c918ae1bc81f"} Dec 04 09:59:52 crc kubenswrapper[4776]: I1204 09:59:52.200862 4776 scope.go:117] "RemoveContainer" containerID="2eefb2df322e1ea90b852532a895947c7d395d3031ee113dcb05e40276cc7217" Dec 04 09:59:52 crc kubenswrapper[4776]: I1204 09:59:52.201026 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-k5qrd" Dec 04 09:59:52 crc kubenswrapper[4776]: I1204 09:59:52.221336 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" event={"ID":"db382a72-b559-43b9-ab00-b843f38661a4","Type":"ContainerStarted","Data":"e810eefc0b980ace4079c6069884c75e2ecfc3880d583641bf9995f185d899c8"} Dec 04 09:59:52 crc kubenswrapper[4776]: I1204 09:59:52.222593 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:52 crc kubenswrapper[4776]: I1204 09:59:52.305396 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" podStartSLOduration=3.30537225 podStartE2EDuration="3.30537225s" podCreationTimestamp="2025-12-04 09:59:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 09:59:52.255409794 +0000 UTC m=+1237.121890181" watchObservedRunningTime="2025-12-04 09:59:52.30537225 +0000 UTC m=+1237.171852627" Dec 04 09:59:52 crc 
kubenswrapper[4776]: I1204 09:59:52.319633 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-k5qrd"] Dec 04 09:59:52 crc kubenswrapper[4776]: I1204 09:59:52.328579 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-k5qrd"] Dec 04 09:59:53 crc kubenswrapper[4776]: I1204 09:59:53.475513 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff2d164b-a423-4431-a0e8-02554236f17f" path="/var/lib/kubelet/pods/ff2d164b-a423-4431-a0e8-02554236f17f/volumes" Dec 04 09:59:59 crc kubenswrapper[4776]: I1204 09:59:59.319564 4776 generic.go:334] "Generic (PLEG): container finished" podID="f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a" containerID="573e315a75a0b96a36b02683cea5517fd4355e5598a9d565f98a1953a929280b" exitCode=0 Dec 04 09:59:59 crc kubenswrapper[4776]: I1204 09:59:59.320387 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-62kn7" event={"ID":"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a","Type":"ContainerDied","Data":"573e315a75a0b96a36b02683cea5517fd4355e5598a9d565f98a1953a929280b"} Dec 04 09:59:59 crc kubenswrapper[4776]: I1204 09:59:59.647715 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 09:59:59 crc kubenswrapper[4776]: I1204 09:59:59.760706 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lc7wm"] Dec 04 09:59:59 crc kubenswrapper[4776]: I1204 09:59:59.761251 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" podUID="ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" containerName="dnsmasq-dns" containerID="cri-o://d7bc5307933b23c3faaaf69ca315ed8153afe1908a6978577a1085fadd6932be" gracePeriod=10 Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.141634 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8"] Dec 04 10:00:00 crc kubenswrapper[4776]: E1204 10:00:00.142114 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff2d164b-a423-4431-a0e8-02554236f17f" containerName="init" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.142133 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2d164b-a423-4431-a0e8-02554236f17f" containerName="init" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.142350 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff2d164b-a423-4431-a0e8-02554236f17f" containerName="init" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.143102 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.146037 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.146358 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.160816 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8"] Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.226692 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l779\" (UniqueName: \"kubernetes.io/projected/4bb6d2bf-b3df-46c7-8f0a-466805a68315-kube-api-access-8l779\") pod \"collect-profiles-29414040-7k8l8\" (UID: \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.226835 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bb6d2bf-b3df-46c7-8f0a-466805a68315-secret-volume\") pod \"collect-profiles-29414040-7k8l8\" (UID: \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.227118 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bb6d2bf-b3df-46c7-8f0a-466805a68315-config-volume\") pod \"collect-profiles-29414040-7k8l8\" (UID: \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.329367 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bb6d2bf-b3df-46c7-8f0a-466805a68315-config-volume\") pod \"collect-profiles-29414040-7k8l8\" (UID: \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.329453 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l779\" (UniqueName: \"kubernetes.io/projected/4bb6d2bf-b3df-46c7-8f0a-466805a68315-kube-api-access-8l779\") pod \"collect-profiles-29414040-7k8l8\" (UID: \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.329496 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bb6d2bf-b3df-46c7-8f0a-466805a68315-secret-volume\") pod \"collect-profiles-29414040-7k8l8\" (UID: 
\"4bb6d2bf-b3df-46c7-8f0a-466805a68315\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.330281 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bb6d2bf-b3df-46c7-8f0a-466805a68315-config-volume\") pod \"collect-profiles-29414040-7k8l8\" (UID: \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.336103 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bb6d2bf-b3df-46c7-8f0a-466805a68315-secret-volume\") pod \"collect-profiles-29414040-7k8l8\" (UID: \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.350955 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l779\" (UniqueName: \"kubernetes.io/projected/4bb6d2bf-b3df-46c7-8f0a-466805a68315-kube-api-access-8l779\") pod \"collect-profiles-29414040-7k8l8\" (UID: \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" Dec 04 10:00:00 crc kubenswrapper[4776]: I1204 10:00:00.458758 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" Dec 04 10:00:01 crc kubenswrapper[4776]: I1204 10:00:01.351203 4776 generic.go:334] "Generic (PLEG): container finished" podID="ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" containerID="d7bc5307933b23c3faaaf69ca315ed8153afe1908a6978577a1085fadd6932be" exitCode=0 Dec 04 10:00:01 crc kubenswrapper[4776]: I1204 10:00:01.351549 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" event={"ID":"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9","Type":"ContainerDied","Data":"d7bc5307933b23c3faaaf69ca315ed8153afe1908a6978577a1085fadd6932be"} Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.345764 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-62kn7" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.374147 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-62kn7" event={"ID":"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a","Type":"ContainerDied","Data":"9f121d6995697685425854433f6fa47612f9697f331c7f1aa98c35e57eebfad9"} Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.374190 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f121d6995697685425854433f6fa47612f9697f331c7f1aa98c35e57eebfad9" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.374260 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-62kn7" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.474562 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-scripts\") pod \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.474680 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-config-data\") pod \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.474741 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf29z\" (UniqueName: \"kubernetes.io/projected/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-kube-api-access-tf29z\") pod \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.474781 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-fernet-keys\") pod \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.474819 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-combined-ca-bundle\") pod \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.474846 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-credential-keys\") pod \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\" (UID: \"f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a\") " Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.484296 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-kube-api-access-tf29z" (OuterVolumeSpecName: "kube-api-access-tf29z") pod "f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a" (UID: "f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a"). InnerVolumeSpecName "kube-api-access-tf29z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.485860 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a" (UID: "f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.486624 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a" (UID: "f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.489406 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-scripts" (OuterVolumeSpecName: "scripts") pod "f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a" (UID: "f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.508116 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-config-data" (OuterVolumeSpecName: "config-data") pod "f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a" (UID: "f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.508418 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a" (UID: "f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.577269 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.577311 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf29z\" (UniqueName: \"kubernetes.io/projected/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-kube-api-access-tf29z\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.577325 4776 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.577336 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.577347 4776 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:02 crc kubenswrapper[4776]: I1204 10:00:02.577358 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.438432 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-62kn7"] Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.445756 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-62kn7"] Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.463634 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a" path="/var/lib/kubelet/pods/f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a/volumes" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.534196 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-m2qwc"] Dec 04 10:00:03 crc kubenswrapper[4776]: E1204 10:00:03.534915 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a" containerName="keystone-bootstrap" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.534958 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a" containerName="keystone-bootstrap" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.535183 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4bcbd97-4710-4df7-9fce-6fb80c0e6d7a" containerName="keystone-bootstrap" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.535807 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.538345 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.538541 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.539157 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.539800 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cwwkl" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.540126 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.546124 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m2qwc"] Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.711445 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9zll\" (UniqueName: \"kubernetes.io/projected/67ed9105-24d2-4ffd-9ee6-7dac342541de-kube-api-access-k9zll\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.711499 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-combined-ca-bundle\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.711568 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-credential-keys\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.711724 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-config-data\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.711767 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-fernet-keys\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.711789 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-scripts\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.813149 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-credential-keys\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.813219 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-config-data\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.813242 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-fernet-keys\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.813266 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-scripts\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.813376 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9zll\" (UniqueName: \"kubernetes.io/projected/67ed9105-24d2-4ffd-9ee6-7dac342541de-kube-api-access-k9zll\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.813404 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-combined-ca-bundle\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.819128 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-config-data\") pod 
\"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.820218 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-combined-ca-bundle\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.820987 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-credential-keys\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.823869 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-scripts\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.831827 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-fernet-keys\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc kubenswrapper[4776]: I1204 10:00:03.836437 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9zll\" (UniqueName: \"kubernetes.io/projected/67ed9105-24d2-4ffd-9ee6-7dac342541de-kube-api-access-k9zll\") pod \"keystone-bootstrap-m2qwc\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:03 crc 
kubenswrapper[4776]: I1204 10:00:03.862500 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:08 crc kubenswrapper[4776]: I1204 10:00:08.018224 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" podUID="ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Dec 04 10:00:13 crc kubenswrapper[4776]: I1204 10:00:13.018944 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" podUID="ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Dec 04 10:00:15 crc kubenswrapper[4776]: E1204 10:00:15.141616 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 04 10:00:15 crc kubenswrapper[4776]: E1204 10:00:15.142197 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g7446,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-z2nzw_openstack(fb1453cb-374e-4b8d-8f13-af4be7baa997): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:00:15 crc kubenswrapper[4776]: E1204 10:00:15.143392 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-z2nzw" podUID="fb1453cb-374e-4b8d-8f13-af4be7baa997" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.207334 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.402975 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-ovsdbserver-sb\") pod \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.403233 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lttjg\" (UniqueName: \"kubernetes.io/projected/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-kube-api-access-lttjg\") pod \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.403280 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-config\") pod \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.403315 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-ovsdbserver-nb\") pod \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.403384 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-dns-svc\") pod \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\" (UID: \"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9\") " Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.408754 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-kube-api-access-lttjg" (OuterVolumeSpecName: "kube-api-access-lttjg") pod "ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" (UID: "ee5d641b-7a26-465b-90c8-ce70bb3ddcb9"). InnerVolumeSpecName "kube-api-access-lttjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.447081 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" (UID: "ee5d641b-7a26-465b-90c8-ce70bb3ddcb9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.450449 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" (UID: "ee5d641b-7a26-465b-90c8-ce70bb3ddcb9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.452066 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" (UID: "ee5d641b-7a26-465b-90c8-ce70bb3ddcb9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.460402 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-config" (OuterVolumeSpecName: "config") pod "ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" (UID: "ee5d641b-7a26-465b-90c8-ce70bb3ddcb9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.506250 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lttjg\" (UniqueName: \"kubernetes.io/projected/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-kube-api-access-lttjg\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.506282 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.506292 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.506300 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 
10:00:15.506310 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.511331 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.511400 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" event={"ID":"ee5d641b-7a26-465b-90c8-ce70bb3ddcb9","Type":"ContainerDied","Data":"aac3184854f83782feca7a6a10c73f7f096e70e8d24d520595b373edb02adbc1"} Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.511453 4776 scope.go:117] "RemoveContainer" containerID="d7bc5307933b23c3faaaf69ca315ed8153afe1908a6978577a1085fadd6932be" Dec 04 10:00:15 crc kubenswrapper[4776]: E1204 10:00:15.515430 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-z2nzw" podUID="fb1453cb-374e-4b8d-8f13-af4be7baa997" Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.581127 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lc7wm"] Dec 04 10:00:15 crc kubenswrapper[4776]: I1204 10:00:15.589325 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lc7wm"] Dec 04 10:00:16 crc kubenswrapper[4776]: E1204 10:00:16.420384 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 04 10:00:16 crc kubenswrapper[4776]: E1204 10:00:16.420637 4776 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fp6d9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privile
ged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-4bt5q_openstack(8765b919-d724-4148-8ba8-a550cd8029fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:00:16 crc kubenswrapper[4776]: E1204 10:00:16.421957 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-4bt5q" podUID="8765b919-d724-4148-8ba8-a550cd8029fc" Dec 04 10:00:16 crc kubenswrapper[4776]: E1204 10:00:16.523144 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-4bt5q" podUID="8765b919-d724-4148-8ba8-a550cd8029fc" Dec 04 10:00:16 crc kubenswrapper[4776]: I1204 10:00:16.895610 4776 scope.go:117] "RemoveContainer" containerID="f4f7150290d9d229bb908f236d53e21e3fbbeb63a65302dcbf1ee1f05aa5f2f2" Dec 04 10:00:17 crc kubenswrapper[4776]: E1204 10:00:17.081410 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 04 10:00:17 crc kubenswrapper[4776]: E1204 10:00:17.090999 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xqrq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-wqhr2_openstack(e92fb916-9d62-42d7-bad8-12cd43af37e9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:00:17 crc kubenswrapper[4776]: E1204 10:00:17.092317 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-wqhr2" podUID="e92fb916-9d62-42d7-bad8-12cd43af37e9" Dec 04 10:00:17 crc kubenswrapper[4776]: W1204 10:00:17.309435 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bb6d2bf_b3df_46c7_8f0a_466805a68315.slice/crio-af15019cc4d8eec4dcec071eb7d806dbc6f5dc0c7b15ca1ba158d3ee6b828882 WatchSource:0}: Error finding container af15019cc4d8eec4dcec071eb7d806dbc6f5dc0c7b15ca1ba158d3ee6b828882: Status 404 returned error can't find the container with id af15019cc4d8eec4dcec071eb7d806dbc6f5dc0c7b15ca1ba158d3ee6b828882 Dec 04 10:00:17 crc kubenswrapper[4776]: I1204 10:00:17.309779 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8"] Dec 04 10:00:17 crc kubenswrapper[4776]: I1204 10:00:17.386827 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m2qwc"] Dec 04 10:00:17 crc kubenswrapper[4776]: W1204 10:00:17.401052 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67ed9105_24d2_4ffd_9ee6_7dac342541de.slice/crio-c8f42d4a6b7f47a0240fcb12bc3ae01f0f7abdc34f636f1236e44d43cfa05a31 WatchSource:0}: Error finding container c8f42d4a6b7f47a0240fcb12bc3ae01f0f7abdc34f636f1236e44d43cfa05a31: Status 404 returned error can't find the container with id c8f42d4a6b7f47a0240fcb12bc3ae01f0f7abdc34f636f1236e44d43cfa05a31 Dec 04 10:00:17 crc kubenswrapper[4776]: I1204 10:00:17.407576 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 10:00:17 crc kubenswrapper[4776]: I1204 10:00:17.462302 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" 
path="/var/lib/kubelet/pods/ee5d641b-7a26-465b-90c8-ce70bb3ddcb9/volumes" Dec 04 10:00:17 crc kubenswrapper[4776]: I1204 10:00:17.532336 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" event={"ID":"4bb6d2bf-b3df-46c7-8f0a-466805a68315","Type":"ContainerStarted","Data":"50084a6e5b0f1ff86902087de9ade4e5514b22c0c90933c58ceeef15b535194a"} Dec 04 10:00:17 crc kubenswrapper[4776]: I1204 10:00:17.532402 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" event={"ID":"4bb6d2bf-b3df-46c7-8f0a-466805a68315","Type":"ContainerStarted","Data":"af15019cc4d8eec4dcec071eb7d806dbc6f5dc0c7b15ca1ba158d3ee6b828882"} Dec 04 10:00:17 crc kubenswrapper[4776]: I1204 10:00:17.535416 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m2qwc" event={"ID":"67ed9105-24d2-4ffd-9ee6-7dac342541de","Type":"ContainerStarted","Data":"c8f42d4a6b7f47a0240fcb12bc3ae01f0f7abdc34f636f1236e44d43cfa05a31"} Dec 04 10:00:17 crc kubenswrapper[4776]: I1204 10:00:17.537305 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bb005b-010a-491e-b788-4ceb11d4c510","Type":"ContainerStarted","Data":"09a3f622e5f7fdad0bec6a0fd5525519d75286f333b1e29c0f5ec17abf28a2eb"} Dec 04 10:00:17 crc kubenswrapper[4776]: E1204 10:00:17.539618 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-wqhr2" podUID="e92fb916-9d62-42d7-bad8-12cd43af37e9" Dec 04 10:00:17 crc kubenswrapper[4776]: I1204 10:00:17.561620 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" 
podStartSLOduration=17.561596404 podStartE2EDuration="17.561596404s" podCreationTimestamp="2025-12-04 10:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:00:17.554819372 +0000 UTC m=+1262.421299759" watchObservedRunningTime="2025-12-04 10:00:17.561596404 +0000 UTC m=+1262.428076791" Dec 04 10:00:18 crc kubenswrapper[4776]: I1204 10:00:18.019758 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-lc7wm" podUID="ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Dec 04 10:00:18 crc kubenswrapper[4776]: I1204 10:00:18.550580 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m2qwc" event={"ID":"67ed9105-24d2-4ffd-9ee6-7dac342541de","Type":"ContainerStarted","Data":"dd61dde0dbd678bae605906865df598586c36f51ad77fc54bba91334151612b1"} Dec 04 10:00:18 crc kubenswrapper[4776]: I1204 10:00:18.555323 4776 generic.go:334] "Generic (PLEG): container finished" podID="4bb6d2bf-b3df-46c7-8f0a-466805a68315" containerID="50084a6e5b0f1ff86902087de9ade4e5514b22c0c90933c58ceeef15b535194a" exitCode=0 Dec 04 10:00:18 crc kubenswrapper[4776]: I1204 10:00:18.555452 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" event={"ID":"4bb6d2bf-b3df-46c7-8f0a-466805a68315","Type":"ContainerDied","Data":"50084a6e5b0f1ff86902087de9ade4e5514b22c0c90933c58ceeef15b535194a"} Dec 04 10:00:18 crc kubenswrapper[4776]: I1204 10:00:18.573812 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-m2qwc" podStartSLOduration=15.57376869 podStartE2EDuration="15.57376869s" podCreationTimestamp="2025-12-04 10:00:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-04 10:00:18.569587969 +0000 UTC m=+1263.436068366" watchObservedRunningTime="2025-12-04 10:00:18.57376869 +0000 UTC m=+1263.440249447" Dec 04 10:00:19 crc kubenswrapper[4776]: I1204 10:00:19.568621 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bb005b-010a-491e-b788-4ceb11d4c510","Type":"ContainerStarted","Data":"1cb7d87a132f26682ced52d29c09a3fbdd11ffd30ba8d8ab6a108849084b6bab"} Dec 04 10:00:20 crc kubenswrapper[4776]: I1204 10:00:20.043074 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" Dec 04 10:00:20 crc kubenswrapper[4776]: I1204 10:00:20.245255 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l779\" (UniqueName: \"kubernetes.io/projected/4bb6d2bf-b3df-46c7-8f0a-466805a68315-kube-api-access-8l779\") pod \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\" (UID: \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\") " Dec 04 10:00:20 crc kubenswrapper[4776]: I1204 10:00:20.245957 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bb6d2bf-b3df-46c7-8f0a-466805a68315-config-volume\") pod \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\" (UID: \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\") " Dec 04 10:00:20 crc kubenswrapper[4776]: I1204 10:00:20.246023 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bb6d2bf-b3df-46c7-8f0a-466805a68315-secret-volume\") pod \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\" (UID: \"4bb6d2bf-b3df-46c7-8f0a-466805a68315\") " Dec 04 10:00:20 crc kubenswrapper[4776]: I1204 10:00:20.246777 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb6d2bf-b3df-46c7-8f0a-466805a68315-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "4bb6d2bf-b3df-46c7-8f0a-466805a68315" (UID: "4bb6d2bf-b3df-46c7-8f0a-466805a68315"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:20 crc kubenswrapper[4776]: I1204 10:00:20.266489 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb6d2bf-b3df-46c7-8f0a-466805a68315-kube-api-access-8l779" (OuterVolumeSpecName: "kube-api-access-8l779") pod "4bb6d2bf-b3df-46c7-8f0a-466805a68315" (UID: "4bb6d2bf-b3df-46c7-8f0a-466805a68315"). InnerVolumeSpecName "kube-api-access-8l779". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:20 crc kubenswrapper[4776]: I1204 10:00:20.266705 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb6d2bf-b3df-46c7-8f0a-466805a68315-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4bb6d2bf-b3df-46c7-8f0a-466805a68315" (UID: "4bb6d2bf-b3df-46c7-8f0a-466805a68315"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:20 crc kubenswrapper[4776]: I1204 10:00:20.347678 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bb6d2bf-b3df-46c7-8f0a-466805a68315-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:20 crc kubenswrapper[4776]: I1204 10:00:20.347717 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bb6d2bf-b3df-46c7-8f0a-466805a68315-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:20 crc kubenswrapper[4776]: I1204 10:00:20.347727 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l779\" (UniqueName: \"kubernetes.io/projected/4bb6d2bf-b3df-46c7-8f0a-466805a68315-kube-api-access-8l779\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:20 crc kubenswrapper[4776]: I1204 10:00:20.578605 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" event={"ID":"4bb6d2bf-b3df-46c7-8f0a-466805a68315","Type":"ContainerDied","Data":"af15019cc4d8eec4dcec071eb7d806dbc6f5dc0c7b15ca1ba158d3ee6b828882"} Dec 04 10:00:20 crc kubenswrapper[4776]: I1204 10:00:20.578649 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af15019cc4d8eec4dcec071eb7d806dbc6f5dc0c7b15ca1ba158d3ee6b828882" Dec 04 10:00:20 crc kubenswrapper[4776]: I1204 10:00:20.578656 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8" Dec 04 10:00:22 crc kubenswrapper[4776]: I1204 10:00:22.599739 4776 generic.go:334] "Generic (PLEG): container finished" podID="67ed9105-24d2-4ffd-9ee6-7dac342541de" containerID="dd61dde0dbd678bae605906865df598586c36f51ad77fc54bba91334151612b1" exitCode=0 Dec 04 10:00:22 crc kubenswrapper[4776]: I1204 10:00:22.599848 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m2qwc" event={"ID":"67ed9105-24d2-4ffd-9ee6-7dac342541de","Type":"ContainerDied","Data":"dd61dde0dbd678bae605906865df598586c36f51ad77fc54bba91334151612b1"} Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.108984 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.113683 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-scripts\") pod \"67ed9105-24d2-4ffd-9ee6-7dac342541de\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.113753 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9zll\" (UniqueName: \"kubernetes.io/projected/67ed9105-24d2-4ffd-9ee6-7dac342541de-kube-api-access-k9zll\") pod \"67ed9105-24d2-4ffd-9ee6-7dac342541de\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.113855 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-combined-ca-bundle\") pod \"67ed9105-24d2-4ffd-9ee6-7dac342541de\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.117881 4776 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67ed9105-24d2-4ffd-9ee6-7dac342541de-kube-api-access-k9zll" (OuterVolumeSpecName: "kube-api-access-k9zll") pod "67ed9105-24d2-4ffd-9ee6-7dac342541de" (UID: "67ed9105-24d2-4ffd-9ee6-7dac342541de"). InnerVolumeSpecName "kube-api-access-k9zll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.118000 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-credential-keys\") pod \"67ed9105-24d2-4ffd-9ee6-7dac342541de\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.118055 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-fernet-keys\") pod \"67ed9105-24d2-4ffd-9ee6-7dac342541de\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.118156 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-config-data\") pod \"67ed9105-24d2-4ffd-9ee6-7dac342541de\" (UID: \"67ed9105-24d2-4ffd-9ee6-7dac342541de\") " Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.118961 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9zll\" (UniqueName: \"kubernetes.io/projected/67ed9105-24d2-4ffd-9ee6-7dac342541de-kube-api-access-k9zll\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.121492 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-scripts" (OuterVolumeSpecName: "scripts") pod 
"67ed9105-24d2-4ffd-9ee6-7dac342541de" (UID: "67ed9105-24d2-4ffd-9ee6-7dac342541de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.141735 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "67ed9105-24d2-4ffd-9ee6-7dac342541de" (UID: "67ed9105-24d2-4ffd-9ee6-7dac342541de"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.154644 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "67ed9105-24d2-4ffd-9ee6-7dac342541de" (UID: "67ed9105-24d2-4ffd-9ee6-7dac342541de"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.161742 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67ed9105-24d2-4ffd-9ee6-7dac342541de" (UID: "67ed9105-24d2-4ffd-9ee6-7dac342541de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.169062 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-config-data" (OuterVolumeSpecName: "config-data") pod "67ed9105-24d2-4ffd-9ee6-7dac342541de" (UID: "67ed9105-24d2-4ffd-9ee6-7dac342541de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.220224 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.220274 4776 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.220286 4776 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.220298 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.220309 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67ed9105-24d2-4ffd-9ee6-7dac342541de-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.625128 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m2qwc" event={"ID":"67ed9105-24d2-4ffd-9ee6-7dac342541de","Type":"ContainerDied","Data":"c8f42d4a6b7f47a0240fcb12bc3ae01f0f7abdc34f636f1236e44d43cfa05a31"} Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.625165 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m2qwc" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.625189 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8f42d4a6b7f47a0240fcb12bc3ae01f0f7abdc34f636f1236e44d43cfa05a31" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.629499 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bb005b-010a-491e-b788-4ceb11d4c510","Type":"ContainerStarted","Data":"8e5cdea40fcc31480ea3ddee3eac1d825352cc73a58df52abff3a08752092295"} Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.765634 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-86f58b95b9-j2njt"] Dec 04 10:00:24 crc kubenswrapper[4776]: E1204 10:00:24.766616 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb6d2bf-b3df-46c7-8f0a-466805a68315" containerName="collect-profiles" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.766653 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb6d2bf-b3df-46c7-8f0a-466805a68315" containerName="collect-profiles" Dec 04 10:00:24 crc kubenswrapper[4776]: E1204 10:00:24.766670 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" containerName="init" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.766679 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" containerName="init" Dec 04 10:00:24 crc kubenswrapper[4776]: E1204 10:00:24.766703 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" containerName="dnsmasq-dns" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.766713 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" containerName="dnsmasq-dns" Dec 04 10:00:24 crc kubenswrapper[4776]: E1204 10:00:24.766730 4776 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="67ed9105-24d2-4ffd-9ee6-7dac342541de" containerName="keystone-bootstrap" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.766738 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ed9105-24d2-4ffd-9ee6-7dac342541de" containerName="keystone-bootstrap" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.766982 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb6d2bf-b3df-46c7-8f0a-466805a68315" containerName="collect-profiles" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.766999 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee5d641b-7a26-465b-90c8-ce70bb3ddcb9" containerName="dnsmasq-dns" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.767018 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ed9105-24d2-4ffd-9ee6-7dac342541de" containerName="keystone-bootstrap" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.767669 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.775554 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cwwkl" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.775554 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.775561 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.775699 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.776701 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.777509 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.777590 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-86f58b95b9-j2njt"] Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.832331 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-config-data\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.832386 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-credential-keys\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " 
pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.832421 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhb74\" (UniqueName: \"kubernetes.io/projected/cb3d4759-6025-4713-90f2-7e7825ad18d3-kube-api-access-vhb74\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.832447 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-combined-ca-bundle\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.832681 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-internal-tls-certs\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.832738 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-public-tls-certs\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.832868 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-fernet-keys\") pod \"keystone-86f58b95b9-j2njt\" (UID: 
\"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.833000 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-scripts\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.939255 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-scripts\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.939329 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-config-data\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.939376 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-credential-keys\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.939415 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhb74\" (UniqueName: \"kubernetes.io/projected/cb3d4759-6025-4713-90f2-7e7825ad18d3-kube-api-access-vhb74\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 
10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.939440 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-combined-ca-bundle\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.939491 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-internal-tls-certs\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.939522 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-public-tls-certs\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.939594 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-fernet-keys\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.946656 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-public-tls-certs\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.946681 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-combined-ca-bundle\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.946856 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-internal-tls-certs\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.947589 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-fernet-keys\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.948008 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-scripts\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.948104 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-config-data\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.949565 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/cb3d4759-6025-4713-90f2-7e7825ad18d3-credential-keys\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:24 crc kubenswrapper[4776]: I1204 10:00:24.956240 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhb74\" (UniqueName: \"kubernetes.io/projected/cb3d4759-6025-4713-90f2-7e7825ad18d3-kube-api-access-vhb74\") pod \"keystone-86f58b95b9-j2njt\" (UID: \"cb3d4759-6025-4713-90f2-7e7825ad18d3\") " pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:25 crc kubenswrapper[4776]: I1204 10:00:25.131057 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:25 crc kubenswrapper[4776]: I1204 10:00:25.581035 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-86f58b95b9-j2njt"] Dec 04 10:00:25 crc kubenswrapper[4776]: I1204 10:00:25.647741 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-86f58b95b9-j2njt" event={"ID":"cb3d4759-6025-4713-90f2-7e7825ad18d3","Type":"ContainerStarted","Data":"dc88cca9f574543c8158d646bf8b682d6260c00e6bc127cf176f76e8ed08bcb7"} Dec 04 10:00:26 crc kubenswrapper[4776]: I1204 10:00:26.659739 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-86f58b95b9-j2njt" event={"ID":"cb3d4759-6025-4713-90f2-7e7825ad18d3","Type":"ContainerStarted","Data":"5a8482e5b769c3bcd9cf3e1d7c627e15058767c1329189fa056659c7422c0ab7"} Dec 04 10:00:26 crc kubenswrapper[4776]: I1204 10:00:26.660336 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:26 crc kubenswrapper[4776]: I1204 10:00:26.684511 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-86f58b95b9-j2njt" podStartSLOduration=2.684483836 podStartE2EDuration="2.684483836s" 
podCreationTimestamp="2025-12-04 10:00:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:00:26.676107884 +0000 UTC m=+1271.542588271" watchObservedRunningTime="2025-12-04 10:00:26.684483836 +0000 UTC m=+1271.550964223" Dec 04 10:00:27 crc kubenswrapper[4776]: I1204 10:00:27.671309 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z2nzw" event={"ID":"fb1453cb-374e-4b8d-8f13-af4be7baa997","Type":"ContainerStarted","Data":"9354f045692036745cc5485d76fe338549e9d13fddee8f13e3c03437b9c26cb5"} Dec 04 10:00:27 crc kubenswrapper[4776]: I1204 10:00:27.673168 4776 generic.go:334] "Generic (PLEG): container finished" podID="c107c323-1d20-4a32-82be-2085097e6d5d" containerID="e916ce4c0ea710c0a4b38826f12fd7b949e75724ba7e9a41cc90391ee9e3198e" exitCode=0 Dec 04 10:00:27 crc kubenswrapper[4776]: I1204 10:00:27.673471 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-76hnz" event={"ID":"c107c323-1d20-4a32-82be-2085097e6d5d","Type":"ContainerDied","Data":"e916ce4c0ea710c0a4b38826f12fd7b949e75724ba7e9a41cc90391ee9e3198e"} Dec 04 10:00:27 crc kubenswrapper[4776]: I1204 10:00:27.694884 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-z2nzw" podStartSLOduration=2.177992894 podStartE2EDuration="38.694866356s" podCreationTimestamp="2025-12-04 09:59:49 +0000 UTC" firstStartedPulling="2025-12-04 09:59:50.528956027 +0000 UTC m=+1235.395436404" lastFinishedPulling="2025-12-04 10:00:27.045829489 +0000 UTC m=+1271.912309866" observedRunningTime="2025-12-04 10:00:27.690638794 +0000 UTC m=+1272.557119191" watchObservedRunningTime="2025-12-04 10:00:27.694866356 +0000 UTC m=+1272.561346733" Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.024907 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-76hnz" Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.087360 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c107c323-1d20-4a32-82be-2085097e6d5d-config\") pod \"c107c323-1d20-4a32-82be-2085097e6d5d\" (UID: \"c107c323-1d20-4a32-82be-2085097e6d5d\") " Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.087546 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c107c323-1d20-4a32-82be-2085097e6d5d-combined-ca-bundle\") pod \"c107c323-1d20-4a32-82be-2085097e6d5d\" (UID: \"c107c323-1d20-4a32-82be-2085097e6d5d\") " Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.087620 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td8ws\" (UniqueName: \"kubernetes.io/projected/c107c323-1d20-4a32-82be-2085097e6d5d-kube-api-access-td8ws\") pod \"c107c323-1d20-4a32-82be-2085097e6d5d\" (UID: \"c107c323-1d20-4a32-82be-2085097e6d5d\") " Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.100253 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c107c323-1d20-4a32-82be-2085097e6d5d-kube-api-access-td8ws" (OuterVolumeSpecName: "kube-api-access-td8ws") pod "c107c323-1d20-4a32-82be-2085097e6d5d" (UID: "c107c323-1d20-4a32-82be-2085097e6d5d"). InnerVolumeSpecName "kube-api-access-td8ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.115139 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c107c323-1d20-4a32-82be-2085097e6d5d-config" (OuterVolumeSpecName: "config") pod "c107c323-1d20-4a32-82be-2085097e6d5d" (UID: "c107c323-1d20-4a32-82be-2085097e6d5d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.116617 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c107c323-1d20-4a32-82be-2085097e6d5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c107c323-1d20-4a32-82be-2085097e6d5d" (UID: "c107c323-1d20-4a32-82be-2085097e6d5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.189750 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td8ws\" (UniqueName: \"kubernetes.io/projected/c107c323-1d20-4a32-82be-2085097e6d5d-kube-api-access-td8ws\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.190101 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c107c323-1d20-4a32-82be-2085097e6d5d-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.190200 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c107c323-1d20-4a32-82be-2085097e6d5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.722005 4776 generic.go:334] "Generic (PLEG): container finished" podID="fb1453cb-374e-4b8d-8f13-af4be7baa997" containerID="9354f045692036745cc5485d76fe338549e9d13fddee8f13e3c03437b9c26cb5" exitCode=0 Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.722136 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z2nzw" event={"ID":"fb1453cb-374e-4b8d-8f13-af4be7baa997","Type":"ContainerDied","Data":"9354f045692036745cc5485d76fe338549e9d13fddee8f13e3c03437b9c26cb5"} Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.726655 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-76hnz" event={"ID":"c107c323-1d20-4a32-82be-2085097e6d5d","Type":"ContainerDied","Data":"1c81e46e75de757dd984cd8fc96a7a859464e43b1cf3d4eae5e4fd9b540bf06a"} Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.726700 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c81e46e75de757dd984cd8fc96a7a859464e43b1cf3d4eae5e4fd9b540bf06a" Dec 04 10:00:32 crc kubenswrapper[4776]: I1204 10:00:32.726757 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-76hnz" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.178278 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-f8rhj"] Dec 04 10:00:33 crc kubenswrapper[4776]: E1204 10:00:33.178633 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c107c323-1d20-4a32-82be-2085097e6d5d" containerName="neutron-db-sync" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.178645 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c107c323-1d20-4a32-82be-2085097e6d5d" containerName="neutron-db-sync" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.178828 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c107c323-1d20-4a32-82be-2085097e6d5d" containerName="neutron-db-sync" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.179709 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.242103 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-f8rhj"] Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.308083 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.308167 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.308229 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-dns-svc\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.308291 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t276q\" (UniqueName: \"kubernetes.io/projected/299f0b77-37c3-4d06-9e8b-00b52cdc5899-kube-api-access-t276q\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.308382 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-config\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.411187 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t276q\" (UniqueName: \"kubernetes.io/projected/299f0b77-37c3-4d06-9e8b-00b52cdc5899-kube-api-access-t276q\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.411324 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-config\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.411360 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.411407 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.411456 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-dns-svc\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.412613 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-dns-svc\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.413043 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-config\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.413381 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.413786 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.437870 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t276q\" (UniqueName: \"kubernetes.io/projected/299f0b77-37c3-4d06-9e8b-00b52cdc5899-kube-api-access-t276q\") pod 
\"dnsmasq-dns-7b946d459c-f8rhj\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.499840 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.501990 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-675d968b4d-rm7mc"] Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.506330 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.516877 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-675d968b4d-rm7mc"] Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.517675 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.518844 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.519248 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-jk975" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.520993 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.719691 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-ovndb-tls-certs\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.719799 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsxzz\" (UniqueName: \"kubernetes.io/projected/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-kube-api-access-qsxzz\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.719827 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-combined-ca-bundle\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.719886 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-config\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.719904 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-httpd-config\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.738717 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4bt5q" event={"ID":"8765b919-d724-4148-8ba8-a550cd8029fc","Type":"ContainerStarted","Data":"add6ce5137423dbc4ec61824b592be617854d5f0fc003d8126228cc875b1adec"} Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.763191 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"33bb005b-010a-491e-b788-4ceb11d4c510","Type":"ContainerStarted","Data":"d826c52cfa06aa6a5e6583495f39c26a67067541ccfc95bab1e1a3e5dce54b60"} Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.763344 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="ceilometer-central-agent" containerID="cri-o://09a3f622e5f7fdad0bec6a0fd5525519d75286f333b1e29c0f5ec17abf28a2eb" gracePeriod=30 Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.763851 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.764255 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="sg-core" containerID="cri-o://8e5cdea40fcc31480ea3ddee3eac1d825352cc73a58df52abff3a08752092295" gracePeriod=30 Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.764356 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="ceilometer-notification-agent" containerID="cri-o://1cb7d87a132f26682ced52d29c09a3fbdd11ffd30ba8d8ab6a108849084b6bab" gracePeriod=30 Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.764389 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="proxy-httpd" containerID="cri-o://d826c52cfa06aa6a5e6583495f39c26a67067541ccfc95bab1e1a3e5dce54b60" gracePeriod=30 Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.775442 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-4bt5q" podStartSLOduration=3.396820111 podStartE2EDuration="45.775419238s" podCreationTimestamp="2025-12-04 09:59:48 
+0000 UTC" firstStartedPulling="2025-12-04 09:59:50.151272792 +0000 UTC m=+1235.017753169" lastFinishedPulling="2025-12-04 10:00:32.529871929 +0000 UTC m=+1277.396352296" observedRunningTime="2025-12-04 10:00:33.764029971 +0000 UTC m=+1278.630510368" watchObservedRunningTime="2025-12-04 10:00:33.775419238 +0000 UTC m=+1278.641899615" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.791235 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.384896668 podStartE2EDuration="45.791215923s" podCreationTimestamp="2025-12-04 09:59:48 +0000 UTC" firstStartedPulling="2025-12-04 09:59:50.154137312 +0000 UTC m=+1235.020617689" lastFinishedPulling="2025-12-04 10:00:32.560456567 +0000 UTC m=+1277.426936944" observedRunningTime="2025-12-04 10:00:33.789808209 +0000 UTC m=+1278.656288606" watchObservedRunningTime="2025-12-04 10:00:33.791215923 +0000 UTC m=+1278.657696290" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.821189 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsxzz\" (UniqueName: \"kubernetes.io/projected/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-kube-api-access-qsxzz\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.821245 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-combined-ca-bundle\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.821312 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-config\") pod \"neutron-675d968b4d-rm7mc\" 
(UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.821332 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-httpd-config\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.821352 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-ovndb-tls-certs\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.827626 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-httpd-config\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.829622 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-ovndb-tls-certs\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.829855 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-combined-ca-bundle\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc 
kubenswrapper[4776]: I1204 10:00:33.832721 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-config\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.843803 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsxzz\" (UniqueName: \"kubernetes.io/projected/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-kube-api-access-qsxzz\") pod \"neutron-675d968b4d-rm7mc\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:33 crc kubenswrapper[4776]: I1204 10:00:33.944879 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.099843 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-f8rhj"] Dec 04 10:00:34 crc kubenswrapper[4776]: W1204 10:00:34.114702 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod299f0b77_37c3_4d06_9e8b_00b52cdc5899.slice/crio-ec64924e65e09260064e0b669fecc242cfb054fc49068cc7dafce73bdfd74982 WatchSource:0}: Error finding container ec64924e65e09260064e0b669fecc242cfb054fc49068cc7dafce73bdfd74982: Status 404 returned error can't find the container with id ec64924e65e09260064e0b669fecc242cfb054fc49068cc7dafce73bdfd74982 Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.206156 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-z2nzw" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.332493 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb1453cb-374e-4b8d-8f13-af4be7baa997-logs\") pod \"fb1453cb-374e-4b8d-8f13-af4be7baa997\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.333087 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7446\" (UniqueName: \"kubernetes.io/projected/fb1453cb-374e-4b8d-8f13-af4be7baa997-kube-api-access-g7446\") pod \"fb1453cb-374e-4b8d-8f13-af4be7baa997\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.333192 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-config-data\") pod \"fb1453cb-374e-4b8d-8f13-af4be7baa997\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.333244 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-combined-ca-bundle\") pod \"fb1453cb-374e-4b8d-8f13-af4be7baa997\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.333336 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-scripts\") pod \"fb1453cb-374e-4b8d-8f13-af4be7baa997\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.354940 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fb1453cb-374e-4b8d-8f13-af4be7baa997-logs" (OuterVolumeSpecName: "logs") pod "fb1453cb-374e-4b8d-8f13-af4be7baa997" (UID: "fb1453cb-374e-4b8d-8f13-af4be7baa997"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.357152 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-scripts" (OuterVolumeSpecName: "scripts") pod "fb1453cb-374e-4b8d-8f13-af4be7baa997" (UID: "fb1453cb-374e-4b8d-8f13-af4be7baa997"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.357532 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1453cb-374e-4b8d-8f13-af4be7baa997-kube-api-access-g7446" (OuterVolumeSpecName: "kube-api-access-g7446") pod "fb1453cb-374e-4b8d-8f13-af4be7baa997" (UID: "fb1453cb-374e-4b8d-8f13-af4be7baa997"). InnerVolumeSpecName "kube-api-access-g7446". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:34 crc kubenswrapper[4776]: E1204 10:00:34.383392 4776 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-combined-ca-bundle podName:fb1453cb-374e-4b8d-8f13-af4be7baa997 nodeName:}" failed. No retries permitted until 2025-12-04 10:00:34.883359698 +0000 UTC m=+1279.749840075 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-combined-ca-bundle") pod "fb1453cb-374e-4b8d-8f13-af4be7baa997" (UID: "fb1453cb-374e-4b8d-8f13-af4be7baa997") : error deleting /var/lib/kubelet/pods/fb1453cb-374e-4b8d-8f13-af4be7baa997/volume-subpaths: remove /var/lib/kubelet/pods/fb1453cb-374e-4b8d-8f13-af4be7baa997/volume-subpaths: no such file or directory Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.389150 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-config-data" (OuterVolumeSpecName: "config-data") pod "fb1453cb-374e-4b8d-8f13-af4be7baa997" (UID: "fb1453cb-374e-4b8d-8f13-af4be7baa997"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.435355 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7446\" (UniqueName: \"kubernetes.io/projected/fb1453cb-374e-4b8d-8f13-af4be7baa997-kube-api-access-g7446\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.435408 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.435418 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.435426 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb1453cb-374e-4b8d-8f13-af4be7baa997-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.797413 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wqhr2" event={"ID":"e92fb916-9d62-42d7-bad8-12cd43af37e9","Type":"ContainerStarted","Data":"1d35f9783f8cf066b45481ccf0829d2b056dc3a577d52a6bf4b0a45768cc5c20"} Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.801407 4776 generic.go:334] "Generic (PLEG): container finished" podID="299f0b77-37c3-4d06-9e8b-00b52cdc5899" containerID="37ead3e24e70c43f4465288ca36a0c84235e417a9fe4d13592f23fe789c38de5" exitCode=0 Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.801509 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" event={"ID":"299f0b77-37c3-4d06-9e8b-00b52cdc5899","Type":"ContainerDied","Data":"37ead3e24e70c43f4465288ca36a0c84235e417a9fe4d13592f23fe789c38de5"} Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.801539 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" event={"ID":"299f0b77-37c3-4d06-9e8b-00b52cdc5899","Type":"ContainerStarted","Data":"ec64924e65e09260064e0b669fecc242cfb054fc49068cc7dafce73bdfd74982"} Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.810957 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z2nzw" event={"ID":"fb1453cb-374e-4b8d-8f13-af4be7baa997","Type":"ContainerDied","Data":"07a60280ac6b36b3d45e7b3aa005ebc9fbba33c10558f19c601cf6f9f36c158c"} Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.810990 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07a60280ac6b36b3d45e7b3aa005ebc9fbba33c10558f19c601cf6f9f36c158c" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.811010 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-z2nzw" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.814507 4776 generic.go:334] "Generic (PLEG): container finished" podID="33bb005b-010a-491e-b788-4ceb11d4c510" containerID="d826c52cfa06aa6a5e6583495f39c26a67067541ccfc95bab1e1a3e5dce54b60" exitCode=0 Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.814526 4776 generic.go:334] "Generic (PLEG): container finished" podID="33bb005b-010a-491e-b788-4ceb11d4c510" containerID="8e5cdea40fcc31480ea3ddee3eac1d825352cc73a58df52abff3a08752092295" exitCode=2 Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.814533 4776 generic.go:334] "Generic (PLEG): container finished" podID="33bb005b-010a-491e-b788-4ceb11d4c510" containerID="09a3f622e5f7fdad0bec6a0fd5525519d75286f333b1e29c0f5ec17abf28a2eb" exitCode=0 Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.814546 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bb005b-010a-491e-b788-4ceb11d4c510","Type":"ContainerDied","Data":"d826c52cfa06aa6a5e6583495f39c26a67067541ccfc95bab1e1a3e5dce54b60"} Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.814561 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bb005b-010a-491e-b788-4ceb11d4c510","Type":"ContainerDied","Data":"8e5cdea40fcc31480ea3ddee3eac1d825352cc73a58df52abff3a08752092295"} Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.814571 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bb005b-010a-491e-b788-4ceb11d4c510","Type":"ContainerDied","Data":"09a3f622e5f7fdad0bec6a0fd5525519d75286f333b1e29c0f5ec17abf28a2eb"} Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.867472 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wqhr2" podStartSLOduration=3.840563405 podStartE2EDuration="46.867449836s" podCreationTimestamp="2025-12-04 
09:59:48 +0000 UTC" firstStartedPulling="2025-12-04 09:59:50.38415488 +0000 UTC m=+1235.250635257" lastFinishedPulling="2025-12-04 10:00:33.411041311 +0000 UTC m=+1278.277521688" observedRunningTime="2025-12-04 10:00:34.824108899 +0000 UTC m=+1279.690589276" watchObservedRunningTime="2025-12-04 10:00:34.867449836 +0000 UTC m=+1279.733930223" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.912963 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d564c574b-x8jlb"] Dec 04 10:00:34 crc kubenswrapper[4776]: E1204 10:00:34.913374 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1453cb-374e-4b8d-8f13-af4be7baa997" containerName="placement-db-sync" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.913391 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1453cb-374e-4b8d-8f13-af4be7baa997" containerName="placement-db-sync" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.913585 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1453cb-374e-4b8d-8f13-af4be7baa997" containerName="placement-db-sync" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.914785 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.918028 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.918196 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.932523 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d564c574b-x8jlb"] Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.946065 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-combined-ca-bundle\") pod \"fb1453cb-374e-4b8d-8f13-af4be7baa997\" (UID: \"fb1453cb-374e-4b8d-8f13-af4be7baa997\") " Dec 04 10:00:34 crc kubenswrapper[4776]: I1204 10:00:34.951548 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb1453cb-374e-4b8d-8f13-af4be7baa997" (UID: "fb1453cb-374e-4b8d-8f13-af4be7baa997"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.050467 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-config-data\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.050735 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-internal-tls-certs\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.050798 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-logs\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.050819 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-combined-ca-bundle\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.051062 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-scripts\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " 
pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.051186 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gskr\" (UniqueName: \"kubernetes.io/projected/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-kube-api-access-8gskr\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.051234 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-public-tls-certs\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.051614 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1453cb-374e-4b8d-8f13-af4be7baa997-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.153491 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-scripts\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.153561 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gskr\" (UniqueName: \"kubernetes.io/projected/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-kube-api-access-8gskr\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.153601 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-public-tls-certs\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.153672 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-config-data\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.153731 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-internal-tls-certs\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.153812 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-logs\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.153845 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-combined-ca-bundle\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.154605 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-logs\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.160242 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-combined-ca-bundle\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.160682 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-config-data\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.162694 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-public-tls-certs\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.162988 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-scripts\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.174640 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-internal-tls-certs\") pod \"placement-6d564c574b-x8jlb\" (UID: 
\"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.183673 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gskr\" (UniqueName: \"kubernetes.io/projected/38afdd55-240c-4460-aa5f-2dbbeb0b0f29-kube-api-access-8gskr\") pod \"placement-6d564c574b-x8jlb\" (UID: \"38afdd55-240c-4460-aa5f-2dbbeb0b0f29\") " pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.347886 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.601802 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5779dfffd5-drdt5"] Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.604070 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.614551 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.614854 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.616045 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5779dfffd5-drdt5"] Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.728408 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-675d968b4d-rm7mc"] Dec 04 10:00:35 crc kubenswrapper[4776]: W1204 10:00:35.728959 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf60afb42_ba20_437f_bbd8_f2c50de3e2d1.slice/crio-be58cee1d66c0577a9c1dd0bd7587de2ed3149aa78b1f36dc44c97e58cdd6f7d WatchSource:0}: Error finding 
container be58cee1d66c0577a9c1dd0bd7587de2ed3149aa78b1f36dc44c97e58cdd6f7d: Status 404 returned error can't find the container with id be58cee1d66c0577a9c1dd0bd7587de2ed3149aa78b1f36dc44c97e58cdd6f7d Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.767259 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-internal-tls-certs\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.767334 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-public-tls-certs\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.767372 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-ovndb-tls-certs\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.767413 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-httpd-config\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.767484 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-combined-ca-bundle\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.767508 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht2c2\" (UniqueName: \"kubernetes.io/projected/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-kube-api-access-ht2c2\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.767586 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-config\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.827325 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" event={"ID":"299f0b77-37c3-4d06-9e8b-00b52cdc5899","Type":"ContainerStarted","Data":"88a60914ee32017d04acb27b1f6f51432445649f9c5bb75c8540c2775e979a35"} Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.827390 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.829704 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-675d968b4d-rm7mc" event={"ID":"f60afb42-ba20-437f-bbd8-f2c50de3e2d1","Type":"ContainerStarted","Data":"be58cee1d66c0577a9c1dd0bd7587de2ed3149aa78b1f36dc44c97e58cdd6f7d"} Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.850972 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" 
podStartSLOduration=2.850953674 podStartE2EDuration="2.850953674s" podCreationTimestamp="2025-12-04 10:00:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:00:35.849360535 +0000 UTC m=+1280.715840912" watchObservedRunningTime="2025-12-04 10:00:35.850953674 +0000 UTC m=+1280.717434051" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.869149 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-internal-tls-certs\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.869336 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-public-tls-certs\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.869439 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-ovndb-tls-certs\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.869542 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-httpd-config\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.869637 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-combined-ca-bundle\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.869702 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht2c2\" (UniqueName: \"kubernetes.io/projected/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-kube-api-access-ht2c2\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.869775 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-config\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.871180 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d564c574b-x8jlb"] Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.874317 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-httpd-config\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.874378 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-ovndb-tls-certs\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc 
kubenswrapper[4776]: I1204 10:00:35.876181 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-combined-ca-bundle\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.876996 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-public-tls-certs\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.877705 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-internal-tls-certs\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.879754 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-config\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.891644 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht2c2\" (UniqueName: \"kubernetes.io/projected/4cbc85fe-8de3-45de-83d6-69da6e1b18d4-kube-api-access-ht2c2\") pod \"neutron-5779dfffd5-drdt5\" (UID: \"4cbc85fe-8de3-45de-83d6-69da6e1b18d4\") " pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:35 crc kubenswrapper[4776]: I1204 10:00:35.947355 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.475657 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5779dfffd5-drdt5"] Dec 04 10:00:36 crc kubenswrapper[4776]: W1204 10:00:36.479526 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cbc85fe_8de3_45de_83d6_69da6e1b18d4.slice/crio-86a59e189ec1dc02a77cb58de6a2914e826450e0a1f638c2dd79a301df2dfad1 WatchSource:0}: Error finding container 86a59e189ec1dc02a77cb58de6a2914e826450e0a1f638c2dd79a301df2dfad1: Status 404 returned error can't find the container with id 86a59e189ec1dc02a77cb58de6a2914e826450e0a1f638c2dd79a301df2dfad1 Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.838446 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d564c574b-x8jlb" event={"ID":"38afdd55-240c-4460-aa5f-2dbbeb0b0f29","Type":"ContainerStarted","Data":"0225a702b793db511d82c4ab23c7fde1a13e15378b41bc3072c66b62934f2cf7"} Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.838484 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d564c574b-x8jlb" event={"ID":"38afdd55-240c-4460-aa5f-2dbbeb0b0f29","Type":"ContainerStarted","Data":"c48ebe25479210954f46a7774396d24ab86d274629505bab5a163911e70e27c5"} Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.838503 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d564c574b-x8jlb" event={"ID":"38afdd55-240c-4460-aa5f-2dbbeb0b0f29","Type":"ContainerStarted","Data":"6064c070e601e1204bce27edada2bebd07ce4b58fe15fa14073b57c839e5e1cc"} Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.838601 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.839764 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-675d968b4d-rm7mc" event={"ID":"f60afb42-ba20-437f-bbd8-f2c50de3e2d1","Type":"ContainerStarted","Data":"c66bd142f0b1256bba10d9c5a058d40e0ea822795f39936d5a6646d14595770e"} Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.839786 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-675d968b4d-rm7mc" event={"ID":"f60afb42-ba20-437f-bbd8-f2c50de3e2d1","Type":"ContainerStarted","Data":"84233a5c27c5c03371efd40bf7d908ff17ce80861169afb025e56cd41ee3b25b"} Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.840222 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.842105 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5779dfffd5-drdt5" event={"ID":"4cbc85fe-8de3-45de-83d6-69da6e1b18d4","Type":"ContainerStarted","Data":"ab485e3249e795b7ffbd98047940a740101d051b1a84df7759cd98d127e0928f"} Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.842129 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5779dfffd5-drdt5" event={"ID":"4cbc85fe-8de3-45de-83d6-69da6e1b18d4","Type":"ContainerStarted","Data":"86a59e189ec1dc02a77cb58de6a2914e826450e0a1f638c2dd79a301df2dfad1"} Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.844102 4776 generic.go:334] "Generic (PLEG): container finished" podID="33bb005b-010a-491e-b788-4ceb11d4c510" containerID="1cb7d87a132f26682ced52d29c09a3fbdd11ffd30ba8d8ab6a108849084b6bab" exitCode=0 Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.844859 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bb005b-010a-491e-b788-4ceb11d4c510","Type":"ContainerDied","Data":"1cb7d87a132f26682ced52d29c09a3fbdd11ffd30ba8d8ab6a108849084b6bab"} Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.865602 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-6d564c574b-x8jlb" podStartSLOduration=2.865575967 podStartE2EDuration="2.865575967s" podCreationTimestamp="2025-12-04 10:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:00:36.858000829 +0000 UTC m=+1281.724481206" watchObservedRunningTime="2025-12-04 10:00:36.865575967 +0000 UTC m=+1281.732056344" Dec 04 10:00:36 crc kubenswrapper[4776]: I1204 10:00:36.901603 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-675d968b4d-rm7mc" podStartSLOduration=3.901581146 podStartE2EDuration="3.901581146s" podCreationTimestamp="2025-12-04 10:00:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:00:36.887499004 +0000 UTC m=+1281.753979381" watchObservedRunningTime="2025-12-04 10:00:36.901581146 +0000 UTC m=+1281.768061543" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.175857 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.306740 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9fwn\" (UniqueName: \"kubernetes.io/projected/33bb005b-010a-491e-b788-4ceb11d4c510-kube-api-access-p9fwn\") pod \"33bb005b-010a-491e-b788-4ceb11d4c510\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.306789 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bb005b-010a-491e-b788-4ceb11d4c510-run-httpd\") pod \"33bb005b-010a-491e-b788-4ceb11d4c510\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.306943 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bb005b-010a-491e-b788-4ceb11d4c510-log-httpd\") pod \"33bb005b-010a-491e-b788-4ceb11d4c510\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.306991 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-scripts\") pod \"33bb005b-010a-491e-b788-4ceb11d4c510\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.307005 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-config-data\") pod \"33bb005b-010a-491e-b788-4ceb11d4c510\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.307088 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-sg-core-conf-yaml\") pod \"33bb005b-010a-491e-b788-4ceb11d4c510\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.307110 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-combined-ca-bundle\") pod \"33bb005b-010a-491e-b788-4ceb11d4c510\" (UID: \"33bb005b-010a-491e-b788-4ceb11d4c510\") " Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.308666 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33bb005b-010a-491e-b788-4ceb11d4c510-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "33bb005b-010a-491e-b788-4ceb11d4c510" (UID: "33bb005b-010a-491e-b788-4ceb11d4c510"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.308960 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33bb005b-010a-491e-b788-4ceb11d4c510-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "33bb005b-010a-491e-b788-4ceb11d4c510" (UID: "33bb005b-010a-491e-b788-4ceb11d4c510"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.315155 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-scripts" (OuterVolumeSpecName: "scripts") pod "33bb005b-010a-491e-b788-4ceb11d4c510" (UID: "33bb005b-010a-491e-b788-4ceb11d4c510"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.315301 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33bb005b-010a-491e-b788-4ceb11d4c510-kube-api-access-p9fwn" (OuterVolumeSpecName: "kube-api-access-p9fwn") pod "33bb005b-010a-491e-b788-4ceb11d4c510" (UID: "33bb005b-010a-491e-b788-4ceb11d4c510"). InnerVolumeSpecName "kube-api-access-p9fwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.339335 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "33bb005b-010a-491e-b788-4ceb11d4c510" (UID: "33bb005b-010a-491e-b788-4ceb11d4c510"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.398951 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33bb005b-010a-491e-b788-4ceb11d4c510" (UID: "33bb005b-010a-491e-b788-4ceb11d4c510"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.406801 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-config-data" (OuterVolumeSpecName: "config-data") pod "33bb005b-010a-491e-b788-4ceb11d4c510" (UID: "33bb005b-010a-491e-b788-4ceb11d4c510"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.409722 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9fwn\" (UniqueName: \"kubernetes.io/projected/33bb005b-010a-491e-b788-4ceb11d4c510-kube-api-access-p9fwn\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.409764 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bb005b-010a-491e-b788-4ceb11d4c510-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.409777 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bb005b-010a-491e-b788-4ceb11d4c510-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.409790 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.409802 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.409813 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.409825 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33bb005b-010a-491e-b788-4ceb11d4c510-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.853895 4776 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-5779dfffd5-drdt5" event={"ID":"4cbc85fe-8de3-45de-83d6-69da6e1b18d4","Type":"ContainerStarted","Data":"1bee25e82cc7c53f23592514efeb0f2d907e1da3dbbf16de29b99efd2a76bb1c"} Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.854679 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.858378 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.858972 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bb005b-010a-491e-b788-4ceb11d4c510","Type":"ContainerDied","Data":"189483f063d3bb7d5d8d53c7dde372569b221abe145ac2c2aab17f534c9c67c1"} Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.859009 4776 scope.go:117] "RemoveContainer" containerID="d826c52cfa06aa6a5e6583495f39c26a67067541ccfc95bab1e1a3e5dce54b60" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.859762 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.878969 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5779dfffd5-drdt5" podStartSLOduration=2.878951731 podStartE2EDuration="2.878951731s" podCreationTimestamp="2025-12-04 10:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:00:37.874664656 +0000 UTC m=+1282.741145033" watchObservedRunningTime="2025-12-04 10:00:37.878951731 +0000 UTC m=+1282.745432108" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.880050 4776 scope.go:117] "RemoveContainer" containerID="8e5cdea40fcc31480ea3ddee3eac1d825352cc73a58df52abff3a08752092295" Dec 04 10:00:37 crc kubenswrapper[4776]: 
I1204 10:00:37.901703 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.912060 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.913578 4776 scope.go:117] "RemoveContainer" containerID="1cb7d87a132f26682ced52d29c09a3fbdd11ffd30ba8d8ab6a108849084b6bab" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.934061 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:00:37 crc kubenswrapper[4776]: E1204 10:00:37.937532 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="sg-core" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.937565 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="sg-core" Dec 04 10:00:37 crc kubenswrapper[4776]: E1204 10:00:37.937588 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="ceilometer-notification-agent" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.937596 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="ceilometer-notification-agent" Dec 04 10:00:37 crc kubenswrapper[4776]: E1204 10:00:37.937622 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="ceilometer-central-agent" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.937629 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="ceilometer-central-agent" Dec 04 10:00:37 crc kubenswrapper[4776]: E1204 10:00:37.944293 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" 
containerName="proxy-httpd" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.944324 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="proxy-httpd" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.963414 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="ceilometer-central-agent" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.963487 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="proxy-httpd" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.963513 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="ceilometer-notification-agent" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.963534 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" containerName="sg-core" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.969153 4776 scope.go:117] "RemoveContainer" containerID="09a3f622e5f7fdad0bec6a0fd5525519d75286f333b1e29c0f5ec17abf28a2eb" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.970372 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.972226 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.977689 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:00:37 crc kubenswrapper[4776]: I1204 10:00:37.977723 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.124561 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-scripts\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.124657 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-config-data\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.124729 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-log-httpd\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.124765 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mm9z\" (UniqueName: \"kubernetes.io/projected/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-kube-api-access-8mm9z\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.124788 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-run-httpd\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.124960 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.125061 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.226730 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-log-httpd\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.226800 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mm9z\" (UniqueName: \"kubernetes.io/projected/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-kube-api-access-8mm9z\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.226823 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-run-httpd\") 
pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.226891 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.226945 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.226984 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-scripts\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.227053 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-config-data\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.227218 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-log-httpd\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.227392 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-run-httpd\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.232739 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.232887 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-config-data\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.243826 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.244003 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-scripts\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.250277 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mm9z\" (UniqueName: \"kubernetes.io/projected/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-kube-api-access-8mm9z\") pod \"ceilometer-0\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.339903 4776 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.796597 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:00:38 crc kubenswrapper[4776]: W1204 10:00:38.802186 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d1f4eb3_1e4e_424a_95cc_6da3be46a54a.slice/crio-8ce808209a6a415fee28f7549f241c6b343b42d31c18905219c2307e69735f4f WatchSource:0}: Error finding container 8ce808209a6a415fee28f7549f241c6b343b42d31c18905219c2307e69735f4f: Status 404 returned error can't find the container with id 8ce808209a6a415fee28f7549f241c6b343b42d31c18905219c2307e69735f4f Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.869791 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a","Type":"ContainerStarted","Data":"8ce808209a6a415fee28f7549f241c6b343b42d31c18905219c2307e69735f4f"} Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.873163 4776 generic.go:334] "Generic (PLEG): container finished" podID="e92fb916-9d62-42d7-bad8-12cd43af37e9" containerID="1d35f9783f8cf066b45481ccf0829d2b056dc3a577d52a6bf4b0a45768cc5c20" exitCode=0 Dec 04 10:00:38 crc kubenswrapper[4776]: I1204 10:00:38.873322 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wqhr2" event={"ID":"e92fb916-9d62-42d7-bad8-12cd43af37e9","Type":"ContainerDied","Data":"1d35f9783f8cf066b45481ccf0829d2b056dc3a577d52a6bf4b0a45768cc5c20"} Dec 04 10:00:39 crc kubenswrapper[4776]: I1204 10:00:39.488768 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33bb005b-010a-491e-b788-4ceb11d4c510" path="/var/lib/kubelet/pods/33bb005b-010a-491e-b788-4ceb11d4c510/volumes" Dec 04 10:00:39 crc kubenswrapper[4776]: I1204 10:00:39.888546 4776 generic.go:334] "Generic 
(PLEG): container finished" podID="8765b919-d724-4148-8ba8-a550cd8029fc" containerID="add6ce5137423dbc4ec61824b592be617854d5f0fc003d8126228cc875b1adec" exitCode=0 Dec 04 10:00:39 crc kubenswrapper[4776]: I1204 10:00:39.889063 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4bt5q" event={"ID":"8765b919-d724-4148-8ba8-a550cd8029fc","Type":"ContainerDied","Data":"add6ce5137423dbc4ec61824b592be617854d5f0fc003d8126228cc875b1adec"} Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.231949 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wqhr2" Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.363353 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqrq2\" (UniqueName: \"kubernetes.io/projected/e92fb916-9d62-42d7-bad8-12cd43af37e9-kube-api-access-xqrq2\") pod \"e92fb916-9d62-42d7-bad8-12cd43af37e9\" (UID: \"e92fb916-9d62-42d7-bad8-12cd43af37e9\") " Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.363442 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92fb916-9d62-42d7-bad8-12cd43af37e9-combined-ca-bundle\") pod \"e92fb916-9d62-42d7-bad8-12cd43af37e9\" (UID: \"e92fb916-9d62-42d7-bad8-12cd43af37e9\") " Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.363587 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e92fb916-9d62-42d7-bad8-12cd43af37e9-db-sync-config-data\") pod \"e92fb916-9d62-42d7-bad8-12cd43af37e9\" (UID: \"e92fb916-9d62-42d7-bad8-12cd43af37e9\") " Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.369602 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e92fb916-9d62-42d7-bad8-12cd43af37e9-kube-api-access-xqrq2" (OuterVolumeSpecName: 
"kube-api-access-xqrq2") pod "e92fb916-9d62-42d7-bad8-12cd43af37e9" (UID: "e92fb916-9d62-42d7-bad8-12cd43af37e9"). InnerVolumeSpecName "kube-api-access-xqrq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.369746 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92fb916-9d62-42d7-bad8-12cd43af37e9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e92fb916-9d62-42d7-bad8-12cd43af37e9" (UID: "e92fb916-9d62-42d7-bad8-12cd43af37e9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.394872 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92fb916-9d62-42d7-bad8-12cd43af37e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e92fb916-9d62-42d7-bad8-12cd43af37e9" (UID: "e92fb916-9d62-42d7-bad8-12cd43af37e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.465901 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92fb916-9d62-42d7-bad8-12cd43af37e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.465972 4776 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e92fb916-9d62-42d7-bad8-12cd43af37e9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.465983 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqrq2\" (UniqueName: \"kubernetes.io/projected/e92fb916-9d62-42d7-bad8-12cd43af37e9-kube-api-access-xqrq2\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.902744 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wqhr2" event={"ID":"e92fb916-9d62-42d7-bad8-12cd43af37e9","Type":"ContainerDied","Data":"c07c6df9b45493430b3ccd52a0a42bd65a3ebe0e5097f03277079d89c126469b"} Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.903442 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c07c6df9b45493430b3ccd52a0a42bd65a3ebe0e5097f03277079d89c126469b" Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.902830 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wqhr2" Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.908378 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a","Type":"ContainerStarted","Data":"27c8ed4d7484083473dc69f99f4342884d26a6c480b5ed536e99164920c5d362"} Dec 04 10:00:40 crc kubenswrapper[4776]: I1204 10:00:40.908429 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a","Type":"ContainerStarted","Data":"50c7c84857d456728e1600cd096e04ba168067e054852148ae76c2e5f23dea3e"} Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.227134 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-854556677-xxtrd"] Dec 04 10:00:41 crc kubenswrapper[4776]: E1204 10:00:41.227802 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e92fb916-9d62-42d7-bad8-12cd43af37e9" containerName="barbican-db-sync" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.227823 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92fb916-9d62-42d7-bad8-12cd43af37e9" containerName="barbican-db-sync" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.228050 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e92fb916-9d62-42d7-bad8-12cd43af37e9" containerName="barbican-db-sync" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.229341 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.233545 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.233903 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hjjs6" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.237576 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.261269 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-854556677-xxtrd"] Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.304325 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5bddcbff8b-x8j6m"] Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.307810 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.313396 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.385320 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5bddcbff8b-x8j6m"] Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.392116 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e476a541-1b98-470c-adf7-812cc06763e1-combined-ca-bundle\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: \"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.392173 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d48466a-6e63-429a-aba8-cc93741041f4-combined-ca-bundle\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.392218 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d48466a-6e63-429a-aba8-cc93741041f4-config-data\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.392241 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk95t\" (UniqueName: 
\"kubernetes.io/projected/e476a541-1b98-470c-adf7-812cc06763e1-kube-api-access-fk95t\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: \"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.392272 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e476a541-1b98-470c-adf7-812cc06763e1-logs\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: \"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.392314 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e476a541-1b98-470c-adf7-812cc06763e1-config-data-custom\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: \"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.392329 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj2bs\" (UniqueName: \"kubernetes.io/projected/9d48466a-6e63-429a-aba8-cc93741041f4-kube-api-access-sj2bs\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.392349 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e476a541-1b98-470c-adf7-812cc06763e1-config-data\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: \"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc 
kubenswrapper[4776]: I1204 10:00:41.392393 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d48466a-6e63-429a-aba8-cc93741041f4-logs\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.392415 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d48466a-6e63-429a-aba8-cc93741041f4-config-data-custom\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.411468 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-f8rhj"] Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.411773 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" podUID="299f0b77-37c3-4d06-9e8b-00b52cdc5899" containerName="dnsmasq-dns" containerID="cri-o://88a60914ee32017d04acb27b1f6f51432445649f9c5bb75c8540c2775e979a35" gracePeriod=10 Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.415089 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.444371 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-4c89s"] Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.454374 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.454825 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4bt5q" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.496957 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e476a541-1b98-470c-adf7-812cc06763e1-combined-ca-bundle\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: \"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.496998 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d48466a-6e63-429a-aba8-cc93741041f4-combined-ca-bundle\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.497044 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d48466a-6e63-429a-aba8-cc93741041f4-config-data\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.497068 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk95t\" (UniqueName: \"kubernetes.io/projected/e476a541-1b98-470c-adf7-812cc06763e1-kube-api-access-fk95t\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: \"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.497099 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e476a541-1b98-470c-adf7-812cc06763e1-logs\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: 
\"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.497140 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e476a541-1b98-470c-adf7-812cc06763e1-config-data-custom\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: \"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.497155 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj2bs\" (UniqueName: \"kubernetes.io/projected/9d48466a-6e63-429a-aba8-cc93741041f4-kube-api-access-sj2bs\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.497172 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e476a541-1b98-470c-adf7-812cc06763e1-config-data\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: \"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.497213 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d48466a-6e63-429a-aba8-cc93741041f4-logs\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.497235 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d48466a-6e63-429a-aba8-cc93741041f4-config-data-custom\") pod 
\"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.497888 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e476a541-1b98-470c-adf7-812cc06763e1-logs\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: \"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.501052 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d48466a-6e63-429a-aba8-cc93741041f4-logs\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.511139 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e476a541-1b98-470c-adf7-812cc06763e1-combined-ca-bundle\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: \"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.514238 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d48466a-6e63-429a-aba8-cc93741041f4-config-data\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.528164 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e476a541-1b98-470c-adf7-812cc06763e1-config-data\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: 
\"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.529629 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d48466a-6e63-429a-aba8-cc93741041f4-config-data-custom\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.530244 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d48466a-6e63-429a-aba8-cc93741041f4-combined-ca-bundle\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.543106 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e476a541-1b98-470c-adf7-812cc06763e1-config-data-custom\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: \"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.586522 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk95t\" (UniqueName: \"kubernetes.io/projected/e476a541-1b98-470c-adf7-812cc06763e1-kube-api-access-fk95t\") pod \"barbican-keystone-listener-5bddcbff8b-x8j6m\" (UID: \"e476a541-1b98-470c-adf7-812cc06763e1\") " pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.608557 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-scripts\") pod 
\"8765b919-d724-4148-8ba8-a550cd8029fc\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.608649 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-config-data\") pod \"8765b919-d724-4148-8ba8-a550cd8029fc\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.608694 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-combined-ca-bundle\") pod \"8765b919-d724-4148-8ba8-a550cd8029fc\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.608767 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8765b919-d724-4148-8ba8-a550cd8029fc-etc-machine-id\") pod \"8765b919-d724-4148-8ba8-a550cd8029fc\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.608800 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp6d9\" (UniqueName: \"kubernetes.io/projected/8765b919-d724-4148-8ba8-a550cd8029fc-kube-api-access-fp6d9\") pod \"8765b919-d724-4148-8ba8-a550cd8029fc\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.608865 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-db-sync-config-data\") pod \"8765b919-d724-4148-8ba8-a550cd8029fc\" (UID: \"8765b919-d724-4148-8ba8-a550cd8029fc\") " Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.609259 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.609331 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8php\" (UniqueName: \"kubernetes.io/projected/f71bf152-d681-4574-b97b-82b8c82f3f96-kube-api-access-d8php\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.609454 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-config\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.609486 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.609504 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-dns-svc\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.611554 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj2bs\" (UniqueName: \"kubernetes.io/projected/9d48466a-6e63-429a-aba8-cc93741041f4-kube-api-access-sj2bs\") pod \"barbican-worker-854556677-xxtrd\" (UID: \"9d48466a-6e63-429a-aba8-cc93741041f4\") " pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.613824 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8765b919-d724-4148-8ba8-a550cd8029fc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8765b919-d724-4148-8ba8-a550cd8029fc" (UID: "8765b919-d724-4148-8ba8-a550cd8029fc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.661442 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8765b919-d724-4148-8ba8-a550cd8029fc-kube-api-access-fp6d9" (OuterVolumeSpecName: "kube-api-access-fp6d9") pod "8765b919-d724-4148-8ba8-a550cd8029fc" (UID: "8765b919-d724-4148-8ba8-a550cd8029fc"). InnerVolumeSpecName "kube-api-access-fp6d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.664261 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8765b919-d724-4148-8ba8-a550cd8029fc" (UID: "8765b919-d724-4148-8ba8-a550cd8029fc"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.687049 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-scripts" (OuterVolumeSpecName: "scripts") pod "8765b919-d724-4148-8ba8-a550cd8029fc" (UID: "8765b919-d724-4148-8ba8-a550cd8029fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.716632 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-854556677-xxtrd" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.725888 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.725992 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8php\" (UniqueName: \"kubernetes.io/projected/f71bf152-d681-4574-b97b-82b8c82f3f96-kube-api-access-d8php\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.726072 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-config\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.726099 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.726118 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-dns-svc\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.726168 4776 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.726181 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.726190 4776 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8765b919-d724-4148-8ba8-a550cd8029fc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.726198 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp6d9\" (UniqueName: \"kubernetes.io/projected/8765b919-d724-4148-8ba8-a550cd8029fc-kube-api-access-fp6d9\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.727228 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-dns-svc\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " 
pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.727847 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.728969 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-config\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.729534 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.751670 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8765b919-d724-4148-8ba8-a550cd8029fc" (UID: "8765b919-d724-4148-8ba8-a550cd8029fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.770687 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8php\" (UniqueName: \"kubernetes.io/projected/f71bf152-d681-4574-b97b-82b8c82f3f96-kube-api-access-d8php\") pod \"dnsmasq-dns-6bb684768f-4c89s\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.772146 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-4c89s"] Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.784811 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.788696 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5889fdcbc8-cklvv"] Dec 04 10:00:41 crc kubenswrapper[4776]: E1204 10:00:41.789193 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8765b919-d724-4148-8ba8-a550cd8029fc" containerName="cinder-db-sync" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.789216 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8765b919-d724-4148-8ba8-a550cd8029fc" containerName="cinder-db-sync" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.789413 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8765b919-d724-4148-8ba8-a550cd8029fc" containerName="cinder-db-sync" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.790474 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.795390 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.803472 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.812642 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5889fdcbc8-cklvv"] Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.875185 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-config-data" (OuterVolumeSpecName: "config-data") pod "8765b919-d724-4148-8ba8-a550cd8029fc" (UID: "8765b919-d724-4148-8ba8-a550cd8029fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.876395 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.876453 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8765b919-d724-4148-8ba8-a550cd8029fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.934445 4776 generic.go:334] "Generic (PLEG): container finished" podID="299f0b77-37c3-4d06-9e8b-00b52cdc5899" containerID="88a60914ee32017d04acb27b1f6f51432445649f9c5bb75c8540c2775e979a35" exitCode=0 Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.934544 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" 
event={"ID":"299f0b77-37c3-4d06-9e8b-00b52cdc5899","Type":"ContainerDied","Data":"88a60914ee32017d04acb27b1f6f51432445649f9c5bb75c8540c2775e979a35"} Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.959515 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4bt5q" event={"ID":"8765b919-d724-4148-8ba8-a550cd8029fc","Type":"ContainerDied","Data":"35da4911858a3384dbebfe6dba44c45fa85a55230c97ce977587cba5ca3a27d3"} Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.959557 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35da4911858a3384dbebfe6dba44c45fa85a55230c97ce977587cba5ca3a27d3" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.959639 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4bt5q" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.979352 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-config-data-custom\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.979561 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-combined-ca-bundle\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.979596 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-config-data\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: 
\"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.979649 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058e23d8-9b9b-4488-9f4b-2060748f6966-logs\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:41 crc kubenswrapper[4776]: I1204 10:00:41.979705 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpnlh\" (UniqueName: \"kubernetes.io/projected/058e23d8-9b9b-4488-9f4b-2060748f6966-kube-api-access-vpnlh\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.081112 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-config-data-custom\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.081297 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-combined-ca-bundle\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.081328 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-config-data\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: 
\"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.081363 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058e23d8-9b9b-4488-9f4b-2060748f6966-logs\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.081422 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpnlh\" (UniqueName: \"kubernetes.io/projected/058e23d8-9b9b-4488-9f4b-2060748f6966-kube-api-access-vpnlh\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.083436 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058e23d8-9b9b-4488-9f4b-2060748f6966-logs\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.093778 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-combined-ca-bundle\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.094099 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-config-data\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 
10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.094423 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-config-data-custom\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.117606 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpnlh\" (UniqueName: \"kubernetes.io/projected/058e23d8-9b9b-4488-9f4b-2060748f6966-kube-api-access-vpnlh\") pod \"barbican-api-5889fdcbc8-cklvv\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.197117 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.229184 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.233978 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.246145 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.247234 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.247462 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xdgz8" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.247585 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.269009 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.376490 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-4c89s"] Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.391398 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.391478 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhpcg\" (UniqueName: \"kubernetes.io/projected/b1d00dde-5442-4613-84c3-6e959bb609f2-kube-api-access-hhpcg\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.391541 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1d00dde-5442-4613-84c3-6e959bb609f2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.391560 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.391814 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.391874 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.433503 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7lwkn"] Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.442331 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.465731 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-854556677-xxtrd"] Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.486440 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7lwkn"] Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.493681 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-config\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.493864 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.494028 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.494056 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " 
pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.494104 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.494133 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhpcg\" (UniqueName: \"kubernetes.io/projected/b1d00dde-5442-4613-84c3-6e959bb609f2-kube-api-access-hhpcg\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.494168 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1d00dde-5442-4613-84c3-6e959bb609f2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.494183 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.494206 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.494228 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc4s9\" (UniqueName: \"kubernetes.io/projected/0b3b18f8-cccb-4159-a2d5-19f75959e6da-kube-api-access-pc4s9\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.498627 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1d00dde-5442-4613-84c3-6e959bb609f2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.499948 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.509431 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.509498 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.518024 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.520214 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.539118 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhpcg\" (UniqueName: \"kubernetes.io/projected/b1d00dde-5442-4613-84c3-6e959bb609f2-kube-api-access-hhpcg\") pod \"cinder-scheduler-0\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.582530 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.585316 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.590676 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.602586 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-config-data-custom\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.603220 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc4s9\" (UniqueName: \"kubernetes.io/projected/0b3b18f8-cccb-4159-a2d5-19f75959e6da-kube-api-access-pc4s9\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.603762 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-scripts\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.604189 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-config-data\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.604237 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-config\") pod 
\"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.604354 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.604388 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/627d090b-c706-469f-9370-f06c1a9d7e89-etc-machine-id\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.604433 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64k8z\" (UniqueName: \"kubernetes.io/projected/627d090b-c706-469f-9370-f06c1a9d7e89-kube-api-access-64k8z\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.604469 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.604491 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") 
" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.604557 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627d090b-c706-469f-9370-f06c1a9d7e89-logs\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.604589 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.607721 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.608076 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-config\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.608569 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.609041 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.609658 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.626059 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.631855 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc4s9\" (UniqueName: \"kubernetes.io/projected/0b3b18f8-cccb-4159-a2d5-19f75959e6da-kube-api-access-pc4s9\") pod \"dnsmasq-dns-6d97fcdd8f-7lwkn\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.696060 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-4c89s"] Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.703403 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.705805 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-ovsdbserver-sb\") pod \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.705950 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-config\") pod \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.706002 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-ovsdbserver-nb\") pod \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.706091 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-dns-svc\") pod \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.706120 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t276q\" (UniqueName: \"kubernetes.io/projected/299f0b77-37c3-4d06-9e8b-00b52cdc5899-kube-api-access-t276q\") pod \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\" (UID: \"299f0b77-37c3-4d06-9e8b-00b52cdc5899\") " Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.706440 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-scripts\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.706502 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-config-data\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.706565 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/627d090b-c706-469f-9370-f06c1a9d7e89-etc-machine-id\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.706608 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64k8z\" (UniqueName: \"kubernetes.io/projected/627d090b-c706-469f-9370-f06c1a9d7e89-kube-api-access-64k8z\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.706685 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627d090b-c706-469f-9370-f06c1a9d7e89-logs\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.706722 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 
10:00:42.706754 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-config-data-custom\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.707565 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/627d090b-c706-469f-9370-f06c1a9d7e89-etc-machine-id\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.716456 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627d090b-c706-469f-9370-f06c1a9d7e89-logs\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.718471 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.724420 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-scripts\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.729884 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.734473 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-config-data\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.737857 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64k8z\" (UniqueName: \"kubernetes.io/projected/627d090b-c706-469f-9370-f06c1a9d7e89-kube-api-access-64k8z\") pod \"cinder-api-0\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.738221 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/299f0b77-37c3-4d06-9e8b-00b52cdc5899-kube-api-access-t276q" (OuterVolumeSpecName: "kube-api-access-t276q") pod "299f0b77-37c3-4d06-9e8b-00b52cdc5899" (UID: "299f0b77-37c3-4d06-9e8b-00b52cdc5899"). InnerVolumeSpecName "kube-api-access-t276q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.793683 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "299f0b77-37c3-4d06-9e8b-00b52cdc5899" (UID: "299f0b77-37c3-4d06-9e8b-00b52cdc5899"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.807086 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5bddcbff8b-x8j6m"] Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.808350 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t276q\" (UniqueName: \"kubernetes.io/projected/299f0b77-37c3-4d06-9e8b-00b52cdc5899-kube-api-access-t276q\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.808378 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.817401 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-config" (OuterVolumeSpecName: "config") pod "299f0b77-37c3-4d06-9e8b-00b52cdc5899" (UID: "299f0b77-37c3-4d06-9e8b-00b52cdc5899"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.825592 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "299f0b77-37c3-4d06-9e8b-00b52cdc5899" (UID: "299f0b77-37c3-4d06-9e8b-00b52cdc5899"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.830538 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "299f0b77-37c3-4d06-9e8b-00b52cdc5899" (UID: "299f0b77-37c3-4d06-9e8b-00b52cdc5899"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.902889 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.909936 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.909972 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.909986 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/299f0b77-37c3-4d06-9e8b-00b52cdc5899-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.936342 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.987793 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-854556677-xxtrd" event={"ID":"9d48466a-6e63-429a-aba8-cc93741041f4","Type":"ContainerStarted","Data":"b1d4d9377f3be370475ff308e876de245577c3993b1279d3eec3adf5174f8b0d"} Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.999784 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" event={"ID":"299f0b77-37c3-4d06-9e8b-00b52cdc5899","Type":"ContainerDied","Data":"ec64924e65e09260064e0b669fecc242cfb054fc49068cc7dafce73bdfd74982"} Dec 04 10:00:42 crc kubenswrapper[4776]: I1204 10:00:42.999855 4776 scope.go:117] "RemoveContainer" containerID="88a60914ee32017d04acb27b1f6f51432445649f9c5bb75c8540c2775e979a35" Dec 04 10:00:43 crc kubenswrapper[4776]: I1204 10:00:43.000069 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-f8rhj" Dec 04 10:00:43 crc kubenswrapper[4776]: I1204 10:00:43.016255 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" event={"ID":"e476a541-1b98-470c-adf7-812cc06763e1","Type":"ContainerStarted","Data":"d30c7bb56b43e42b2c60afae1c8365f045895ef77642bd50597669565c30a35e"} Dec 04 10:00:43 crc kubenswrapper[4776]: I1204 10:00:43.027341 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-4c89s" event={"ID":"f71bf152-d681-4574-b97b-82b8c82f3f96","Type":"ContainerStarted","Data":"bac4cb4dae27dd2f4426ced2b7b493543d1c59700d3f3b85d9c84ed8246de140"} Dec 04 10:00:43 crc kubenswrapper[4776]: I1204 10:00:43.041883 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a","Type":"ContainerStarted","Data":"528db00aae47b8937035bb07d6facb53b09b42e3bf0784e3fe161ce0fb3c9ae6"} 
Dec 04 10:00:43 crc kubenswrapper[4776]: I1204 10:00:43.055534 4776 scope.go:117] "RemoveContainer" containerID="37ead3e24e70c43f4465288ca36a0c84235e417a9fe4d13592f23fe789c38de5" Dec 04 10:00:43 crc kubenswrapper[4776]: I1204 10:00:43.070438 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-f8rhj"] Dec 04 10:00:43 crc kubenswrapper[4776]: I1204 10:00:43.099209 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-f8rhj"] Dec 04 10:00:43 crc kubenswrapper[4776]: I1204 10:00:43.117641 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5889fdcbc8-cklvv"] Dec 04 10:00:43 crc kubenswrapper[4776]: W1204 10:00:43.188009 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod058e23d8_9b9b_4488_9f4b_2060748f6966.slice/crio-af95ac553d74c5466a444280f97a7b367e061fb95380dd3db4e9972e8ed5b3de WatchSource:0}: Error finding container af95ac553d74c5466a444280f97a7b367e061fb95380dd3db4e9972e8ed5b3de: Status 404 returned error can't find the container with id af95ac553d74c5466a444280f97a7b367e061fb95380dd3db4e9972e8ed5b3de Dec 04 10:00:43 crc kubenswrapper[4776]: I1204 10:00:43.420101 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7lwkn"] Dec 04 10:00:43 crc kubenswrapper[4776]: I1204 10:00:43.476622 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="299f0b77-37c3-4d06-9e8b-00b52cdc5899" path="/var/lib/kubelet/pods/299f0b77-37c3-4d06-9e8b-00b52cdc5899/volumes" Dec 04 10:00:43 crc kubenswrapper[4776]: I1204 10:00:43.477320 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:00:43 crc kubenswrapper[4776]: W1204 10:00:43.496197 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1d00dde_5442_4613_84c3_6e959bb609f2.slice/crio-0ee4d6766b2c134b51a33f10478f1667895eb3f0493fcd46223140da838c63ee WatchSource:0}: Error finding container 0ee4d6766b2c134b51a33f10478f1667895eb3f0493fcd46223140da838c63ee: Status 404 returned error can't find the container with id 0ee4d6766b2c134b51a33f10478f1667895eb3f0493fcd46223140da838c63ee Dec 04 10:00:43 crc kubenswrapper[4776]: I1204 10:00:43.736277 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.058316 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5889fdcbc8-cklvv" event={"ID":"058e23d8-9b9b-4488-9f4b-2060748f6966","Type":"ContainerStarted","Data":"c902dcd5deab89cad42cbc097e54c72c6d07a37bef0f5af0346b3d4b813066b4"} Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.058602 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5889fdcbc8-cklvv" event={"ID":"058e23d8-9b9b-4488-9f4b-2060748f6966","Type":"ContainerStarted","Data":"af95ac553d74c5466a444280f97a7b367e061fb95380dd3db4e9972e8ed5b3de"} Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.059431 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1d00dde-5442-4613-84c3-6e959bb609f2","Type":"ContainerStarted","Data":"0ee4d6766b2c134b51a33f10478f1667895eb3f0493fcd46223140da838c63ee"} Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.065537 4776 generic.go:334] "Generic (PLEG): container finished" podID="f71bf152-d681-4574-b97b-82b8c82f3f96" containerID="bb15fcb8ef9ba39b4bacc41d910f7fa1178ee16f74ef09beaffed6817e9e11c6" exitCode=0 Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.065617 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-4c89s" 
event={"ID":"f71bf152-d681-4574-b97b-82b8c82f3f96","Type":"ContainerDied","Data":"bb15fcb8ef9ba39b4bacc41d910f7fa1178ee16f74ef09beaffed6817e9e11c6"} Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.073506 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a","Type":"ContainerStarted","Data":"32f69bbdee2fdb20e5603c82c042d018b1f39eee0e7a111f6abfcefeb564b013"} Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.074042 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.081375 4776 generic.go:334] "Generic (PLEG): container finished" podID="0b3b18f8-cccb-4159-a2d5-19f75959e6da" containerID="9c334638aab625c2458d38d75d5016e4113a05bb82e5e514f14b63b53b658bc1" exitCode=0 Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.081497 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" event={"ID":"0b3b18f8-cccb-4159-a2d5-19f75959e6da","Type":"ContainerDied","Data":"9c334638aab625c2458d38d75d5016e4113a05bb82e5e514f14b63b53b658bc1"} Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.081527 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" event={"ID":"0b3b18f8-cccb-4159-a2d5-19f75959e6da","Type":"ContainerStarted","Data":"34918026f87ff7cec4c7949478a4fd3d9279a5f075e2885ee7834d063b5abb3b"} Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.085884 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"627d090b-c706-469f-9370-f06c1a9d7e89","Type":"ContainerStarted","Data":"0b7c250048dfd4a24dfb3e670db7323ad31583df16409e4ec7dd9b74fddfe271"} Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.154013 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.621248265 
podStartE2EDuration="7.153989806s" podCreationTimestamp="2025-12-04 10:00:37 +0000 UTC" firstStartedPulling="2025-12-04 10:00:38.805639439 +0000 UTC m=+1283.672119816" lastFinishedPulling="2025-12-04 10:00:43.33838097 +0000 UTC m=+1288.204861357" observedRunningTime="2025-12-04 10:00:44.148796514 +0000 UTC m=+1289.015276891" watchObservedRunningTime="2025-12-04 10:00:44.153989806 +0000 UTC m=+1289.020470193" Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.476146 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.589230 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-config\") pod \"f71bf152-d681-4574-b97b-82b8c82f3f96\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.589328 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-dns-svc\") pod \"f71bf152-d681-4574-b97b-82b8c82f3f96\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.589404 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8php\" (UniqueName: \"kubernetes.io/projected/f71bf152-d681-4574-b97b-82b8c82f3f96-kube-api-access-d8php\") pod \"f71bf152-d681-4574-b97b-82b8c82f3f96\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.590120 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-ovsdbserver-nb\") pod \"f71bf152-d681-4574-b97b-82b8c82f3f96\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") 
" Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.590480 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-ovsdbserver-sb\") pod \"f71bf152-d681-4574-b97b-82b8c82f3f96\" (UID: \"f71bf152-d681-4574-b97b-82b8c82f3f96\") " Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.595200 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f71bf152-d681-4574-b97b-82b8c82f3f96-kube-api-access-d8php" (OuterVolumeSpecName: "kube-api-access-d8php") pod "f71bf152-d681-4574-b97b-82b8c82f3f96" (UID: "f71bf152-d681-4574-b97b-82b8c82f3f96"). InnerVolumeSpecName "kube-api-access-d8php". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.616608 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-config" (OuterVolumeSpecName: "config") pod "f71bf152-d681-4574-b97b-82b8c82f3f96" (UID: "f71bf152-d681-4574-b97b-82b8c82f3f96"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.634481 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f71bf152-d681-4574-b97b-82b8c82f3f96" (UID: "f71bf152-d681-4574-b97b-82b8c82f3f96"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.649787 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f71bf152-d681-4574-b97b-82b8c82f3f96" (UID: "f71bf152-d681-4574-b97b-82b8c82f3f96"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.652228 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f71bf152-d681-4574-b97b-82b8c82f3f96" (UID: "f71bf152-d681-4574-b97b-82b8c82f3f96"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.692754 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.692795 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.692810 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8php\" (UniqueName: \"kubernetes.io/projected/f71bf152-d681-4574-b97b-82b8c82f3f96-kube-api-access-d8php\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:44 crc kubenswrapper[4776]: I1204 10:00:44.692828 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:44 crc 
kubenswrapper[4776]: I1204 10:00:44.692841 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f71bf152-d681-4574-b97b-82b8c82f3f96-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:45 crc kubenswrapper[4776]: I1204 10:00:45.097691 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-4c89s" event={"ID":"f71bf152-d681-4574-b97b-82b8c82f3f96","Type":"ContainerDied","Data":"bac4cb4dae27dd2f4426ced2b7b493543d1c59700d3f3b85d9c84ed8246de140"} Dec 04 10:00:45 crc kubenswrapper[4776]: I1204 10:00:45.098103 4776 scope.go:117] "RemoveContainer" containerID="bb15fcb8ef9ba39b4bacc41d910f7fa1178ee16f74ef09beaffed6817e9e11c6" Dec 04 10:00:45 crc kubenswrapper[4776]: I1204 10:00:45.097894 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-4c89s" Dec 04 10:00:45 crc kubenswrapper[4776]: I1204 10:00:45.118093 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" event={"ID":"0b3b18f8-cccb-4159-a2d5-19f75959e6da","Type":"ContainerStarted","Data":"0e97f76ef4550100d2b4b29faa12d8e727c7c0d7e889b30a612362b823b1f06d"} Dec 04 10:00:45 crc kubenswrapper[4776]: I1204 10:00:45.119224 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:45 crc kubenswrapper[4776]: I1204 10:00:45.127422 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"627d090b-c706-469f-9370-f06c1a9d7e89","Type":"ContainerStarted","Data":"874ce1100ca1e2a3e16f1930772e05aeb4fae0ae22ac06c5d830527c35c83f77"} Dec 04 10:00:45 crc kubenswrapper[4776]: I1204 10:00:45.130954 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5889fdcbc8-cklvv" 
event={"ID":"058e23d8-9b9b-4488-9f4b-2060748f6966","Type":"ContainerStarted","Data":"b4bbd6145fdcbc6d865f119fa4bf6ab72cefd0725027dd7dbef573c5b15b7116"} Dec 04 10:00:45 crc kubenswrapper[4776]: I1204 10:00:45.131009 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:45 crc kubenswrapper[4776]: I1204 10:00:45.131027 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:45 crc kubenswrapper[4776]: I1204 10:00:45.186996 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-4c89s"] Dec 04 10:00:45 crc kubenswrapper[4776]: I1204 10:00:45.193128 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-4c89s"] Dec 04 10:00:45 crc kubenswrapper[4776]: I1204 10:00:45.207981 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5889fdcbc8-cklvv" podStartSLOduration=4.207943422 podStartE2EDuration="4.207943422s" podCreationTimestamp="2025-12-04 10:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:00:45.198810116 +0000 UTC m=+1290.065290493" watchObservedRunningTime="2025-12-04 10:00:45.207943422 +0000 UTC m=+1290.074423799" Dec 04 10:00:45 crc kubenswrapper[4776]: I1204 10:00:45.243486 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" podStartSLOduration=3.243466255 podStartE2EDuration="3.243466255s" podCreationTimestamp="2025-12-04 10:00:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:00:45.240714429 +0000 UTC m=+1290.107194806" watchObservedRunningTime="2025-12-04 10:00:45.243466255 +0000 UTC m=+1290.109946632" Dec 04 10:00:45 crc 
kubenswrapper[4776]: I1204 10:00:45.498398 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f71bf152-d681-4574-b97b-82b8c82f3f96" path="/var/lib/kubelet/pods/f71bf152-d681-4574-b97b-82b8c82f3f96/volumes" Dec 04 10:00:46 crc kubenswrapper[4776]: I1204 10:00:46.150600 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1d00dde-5442-4613-84c3-6e959bb609f2","Type":"ContainerStarted","Data":"cdfee2f11d9e1793ddc988573339ab2c7660e3bd826afada03c6b95f9f9d0964"} Dec 04 10:00:46 crc kubenswrapper[4776]: I1204 10:00:46.172561 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:00:47 crc kubenswrapper[4776]: I1204 10:00:47.175683 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"627d090b-c706-469f-9370-f06c1a9d7e89","Type":"ContainerStarted","Data":"3cc1b18e216b741b511737efe8c0aa37f10f020bd37732e2d3d41c6610c4392a"} Dec 04 10:00:47 crc kubenswrapper[4776]: I1204 10:00:47.175837 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="627d090b-c706-469f-9370-f06c1a9d7e89" containerName="cinder-api" containerID="cri-o://3cc1b18e216b741b511737efe8c0aa37f10f020bd37732e2d3d41c6610c4392a" gracePeriod=30 Dec 04 10:00:47 crc kubenswrapper[4776]: I1204 10:00:47.175841 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="627d090b-c706-469f-9370-f06c1a9d7e89" containerName="cinder-api-log" containerID="cri-o://874ce1100ca1e2a3e16f1930772e05aeb4fae0ae22ac06c5d830527c35c83f77" gracePeriod=30 Dec 04 10:00:47 crc kubenswrapper[4776]: I1204 10:00:47.176321 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 04 10:00:47 crc kubenswrapper[4776]: I1204 10:00:47.185412 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"b1d00dde-5442-4613-84c3-6e959bb609f2","Type":"ContainerStarted","Data":"47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae"} Dec 04 10:00:47 crc kubenswrapper[4776]: I1204 10:00:47.199438 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" event={"ID":"e476a541-1b98-470c-adf7-812cc06763e1","Type":"ContainerStarted","Data":"d13833f91a852f386d60ac5c7a4278063926df51b5310a4d29c276cc938577ce"} Dec 04 10:00:47 crc kubenswrapper[4776]: I1204 10:00:47.199487 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" event={"ID":"e476a541-1b98-470c-adf7-812cc06763e1","Type":"ContainerStarted","Data":"4f8d27d44418f8f058e1a7525333a012811aa724b01da11707dadd4e305789cf"} Dec 04 10:00:47 crc kubenswrapper[4776]: I1204 10:00:47.204047 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-854556677-xxtrd" event={"ID":"9d48466a-6e63-429a-aba8-cc93741041f4","Type":"ContainerStarted","Data":"0e04745cf48ac935fb492da4aba1ad84d260a1bf7a284b02d9e302d0d57f6161"} Dec 04 10:00:47 crc kubenswrapper[4776]: I1204 10:00:47.204100 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-854556677-xxtrd" event={"ID":"9d48466a-6e63-429a-aba8-cc93741041f4","Type":"ContainerStarted","Data":"31db809ef6b3f1299fbbf5aa7cc2b3418b055446a68edbd2eed28f4d5df18540"} Dec 04 10:00:47 crc kubenswrapper[4776]: I1204 10:00:47.219829 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.219806323 podStartE2EDuration="5.219806323s" podCreationTimestamp="2025-12-04 10:00:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:00:47.206338211 +0000 UTC m=+1292.072818598" watchObservedRunningTime="2025-12-04 10:00:47.219806323 +0000 UTC 
m=+1292.086286700" Dec 04 10:00:47 crc kubenswrapper[4776]: I1204 10:00:47.250226 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-854556677-xxtrd" podStartSLOduration=2.698989488 podStartE2EDuration="6.250210595s" podCreationTimestamp="2025-12-04 10:00:41 +0000 UTC" firstStartedPulling="2025-12-04 10:00:42.554950581 +0000 UTC m=+1287.421430958" lastFinishedPulling="2025-12-04 10:00:46.106171688 +0000 UTC m=+1290.972652065" observedRunningTime="2025-12-04 10:00:47.244674822 +0000 UTC m=+1292.111155209" watchObservedRunningTime="2025-12-04 10:00:47.250210595 +0000 UTC m=+1292.116690962" Dec 04 10:00:47 crc kubenswrapper[4776]: I1204 10:00:47.315334 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.474144778 podStartE2EDuration="5.315314186s" podCreationTimestamp="2025-12-04 10:00:42 +0000 UTC" firstStartedPulling="2025-12-04 10:00:43.502072859 +0000 UTC m=+1288.368553246" lastFinishedPulling="2025-12-04 10:00:44.343242277 +0000 UTC m=+1289.209722654" observedRunningTime="2025-12-04 10:00:47.283999314 +0000 UTC m=+1292.150479701" watchObservedRunningTime="2025-12-04 10:00:47.315314186 +0000 UTC m=+1292.181794563" Dec 04 10:00:47 crc kubenswrapper[4776]: I1204 10:00:47.705116 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.242742 4776 generic.go:334] "Generic (PLEG): container finished" podID="627d090b-c706-469f-9370-f06c1a9d7e89" containerID="874ce1100ca1e2a3e16f1930772e05aeb4fae0ae22ac06c5d830527c35c83f77" exitCode=143 Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.243682 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"627d090b-c706-469f-9370-f06c1a9d7e89","Type":"ContainerDied","Data":"874ce1100ca1e2a3e16f1930772e05aeb4fae0ae22ac06c5d830527c35c83f77"} Dec 04 
10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.601159 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5bddcbff8b-x8j6m" podStartSLOduration=4.347452334 podStartE2EDuration="7.601132097s" podCreationTimestamp="2025-12-04 10:00:41 +0000 UTC" firstStartedPulling="2025-12-04 10:00:42.83349105 +0000 UTC m=+1287.699971427" lastFinishedPulling="2025-12-04 10:00:46.087170813 +0000 UTC m=+1290.953651190" observedRunningTime="2025-12-04 10:00:47.32438857 +0000 UTC m=+1292.190868977" watchObservedRunningTime="2025-12-04 10:00:48.601132097 +0000 UTC m=+1293.467612474" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.609581 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-555f995688-x45jv"] Dec 04 10:00:48 crc kubenswrapper[4776]: E1204 10:00:48.610107 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71bf152-d681-4574-b97b-82b8c82f3f96" containerName="init" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.610130 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71bf152-d681-4574-b97b-82b8c82f3f96" containerName="init" Dec 04 10:00:48 crc kubenswrapper[4776]: E1204 10:00:48.610151 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="299f0b77-37c3-4d06-9e8b-00b52cdc5899" containerName="init" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.610160 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="299f0b77-37c3-4d06-9e8b-00b52cdc5899" containerName="init" Dec 04 10:00:48 crc kubenswrapper[4776]: E1204 10:00:48.610174 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="299f0b77-37c3-4d06-9e8b-00b52cdc5899" containerName="dnsmasq-dns" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.610183 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="299f0b77-37c3-4d06-9e8b-00b52cdc5899" containerName="dnsmasq-dns" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.610393 
4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="299f0b77-37c3-4d06-9e8b-00b52cdc5899" containerName="dnsmasq-dns" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.610429 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71bf152-d681-4574-b97b-82b8c82f3f96" containerName="init" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.611554 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.628672 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.636006 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-555f995688-x45jv"] Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.635978 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.694281 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-internal-tls-certs\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.694474 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-config-data\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.694568 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/71750c70-c6f5-441b-8dae-2c78f53f5e0f-logs\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.694664 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckhbf\" (UniqueName: \"kubernetes.io/projected/71750c70-c6f5-441b-8dae-2c78f53f5e0f-kube-api-access-ckhbf\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.694720 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-config-data-custom\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.694745 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-public-tls-certs\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.694828 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-combined-ca-bundle\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.797031 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-config-data\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.797096 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71750c70-c6f5-441b-8dae-2c78f53f5e0f-logs\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.797158 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckhbf\" (UniqueName: \"kubernetes.io/projected/71750c70-c6f5-441b-8dae-2c78f53f5e0f-kube-api-access-ckhbf\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.797213 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-config-data-custom\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.797232 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-public-tls-certs\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.797258 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-combined-ca-bundle\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.797288 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-internal-tls-certs\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.797638 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71750c70-c6f5-441b-8dae-2c78f53f5e0f-logs\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.805945 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-internal-tls-certs\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.823708 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-combined-ca-bundle\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.824007 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-config-data\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.824413 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckhbf\" (UniqueName: \"kubernetes.io/projected/71750c70-c6f5-441b-8dae-2c78f53f5e0f-kube-api-access-ckhbf\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.824813 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-config-data-custom\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.824851 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71750c70-c6f5-441b-8dae-2c78f53f5e0f-public-tls-certs\") pod \"barbican-api-555f995688-x45jv\" (UID: \"71750c70-c6f5-441b-8dae-2c78f53f5e0f\") " pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:48 crc kubenswrapper[4776]: I1204 10:00:48.973742 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:49 crc kubenswrapper[4776]: I1204 10:00:49.380232 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:00:49 crc kubenswrapper[4776]: I1204 10:00:49.380627 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:00:49 crc kubenswrapper[4776]: I1204 10:00:49.467702 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-555f995688-x45jv"] Dec 04 10:00:49 crc kubenswrapper[4776]: W1204 10:00:49.471998 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71750c70_c6f5_441b_8dae_2c78f53f5e0f.slice/crio-86ebda80d0bc7ccbb772ff4ef59072cf75ac7108294fbeec2c6ab06051a759dd WatchSource:0}: Error finding container 86ebda80d0bc7ccbb772ff4ef59072cf75ac7108294fbeec2c6ab06051a759dd: Status 404 returned error can't find the container with id 86ebda80d0bc7ccbb772ff4ef59072cf75ac7108294fbeec2c6ab06051a759dd Dec 04 10:00:50 crc kubenswrapper[4776]: I1204 10:00:50.264343 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-555f995688-x45jv" event={"ID":"71750c70-c6f5-441b-8dae-2c78f53f5e0f","Type":"ContainerStarted","Data":"5fefa246833ec736c5010dd9cf1e6c86a3830de69178b8bef54e181f46001b8c"} Dec 04 10:00:50 crc kubenswrapper[4776]: I1204 10:00:50.264752 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:50 crc kubenswrapper[4776]: I1204 10:00:50.264769 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-555f995688-x45jv" event={"ID":"71750c70-c6f5-441b-8dae-2c78f53f5e0f","Type":"ContainerStarted","Data":"d980093367f5ed309795118ef0f60557243ccac5e9a8be61549176f571eab29b"} Dec 04 10:00:50 crc kubenswrapper[4776]: I1204 10:00:50.264781 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-555f995688-x45jv" event={"ID":"71750c70-c6f5-441b-8dae-2c78f53f5e0f","Type":"ContainerStarted","Data":"86ebda80d0bc7ccbb772ff4ef59072cf75ac7108294fbeec2c6ab06051a759dd"} Dec 04 10:00:51 crc kubenswrapper[4776]: I1204 10:00:51.276235 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:00:52 crc kubenswrapper[4776]: I1204 10:00:52.904148 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:00:52 crc kubenswrapper[4776]: I1204 10:00:52.956964 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-555f995688-x45jv" podStartSLOduration=4.9569432540000005 podStartE2EDuration="4.956943254s" podCreationTimestamp="2025-12-04 10:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:00:50.289884162 +0000 UTC m=+1295.156364549" watchObservedRunningTime="2025-12-04 10:00:52.956943254 +0000 UTC m=+1297.823423631" Dec 04 10:00:52 crc kubenswrapper[4776]: I1204 10:00:52.973274 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-s2n66"] Dec 04 10:00:52 crc kubenswrapper[4776]: I1204 10:00:52.974851 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" 
podUID="db382a72-b559-43b9-ab00-b843f38661a4" containerName="dnsmasq-dns" containerID="cri-o://e810eefc0b980ace4079c6069884c75e2ecfc3880d583641bf9995f185d899c8" gracePeriod=10 Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.028146 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.115998 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.302687 4776 generic.go:334] "Generic (PLEG): container finished" podID="db382a72-b559-43b9-ab00-b843f38661a4" containerID="e810eefc0b980ace4079c6069884c75e2ecfc3880d583641bf9995f185d899c8" exitCode=0 Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.302962 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b1d00dde-5442-4613-84c3-6e959bb609f2" containerName="cinder-scheduler" containerID="cri-o://cdfee2f11d9e1793ddc988573339ab2c7660e3bd826afada03c6b95f9f9d0964" gracePeriod=30 Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.303234 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b1d00dde-5442-4613-84c3-6e959bb609f2" containerName="probe" containerID="cri-o://47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae" gracePeriod=30 Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.303333 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" event={"ID":"db382a72-b559-43b9-ab00-b843f38661a4","Type":"ContainerDied","Data":"e810eefc0b980ace4079c6069884c75e2ecfc3880d583641bf9995f185d899c8"} Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.578310 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.697899 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-dns-svc\") pod \"db382a72-b559-43b9-ab00-b843f38661a4\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.698056 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-ovsdbserver-nb\") pod \"db382a72-b559-43b9-ab00-b843f38661a4\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.698213 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-ovsdbserver-sb\") pod \"db382a72-b559-43b9-ab00-b843f38661a4\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.698429 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfjvc\" (UniqueName: \"kubernetes.io/projected/db382a72-b559-43b9-ab00-b843f38661a4-kube-api-access-qfjvc\") pod \"db382a72-b559-43b9-ab00-b843f38661a4\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.698564 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-config\") pod \"db382a72-b559-43b9-ab00-b843f38661a4\" (UID: \"db382a72-b559-43b9-ab00-b843f38661a4\") " Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.712147 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/db382a72-b559-43b9-ab00-b843f38661a4-kube-api-access-qfjvc" (OuterVolumeSpecName: "kube-api-access-qfjvc") pod "db382a72-b559-43b9-ab00-b843f38661a4" (UID: "db382a72-b559-43b9-ab00-b843f38661a4"). InnerVolumeSpecName "kube-api-access-qfjvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.754154 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "db382a72-b559-43b9-ab00-b843f38661a4" (UID: "db382a72-b559-43b9-ab00-b843f38661a4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.759683 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db382a72-b559-43b9-ab00-b843f38661a4" (UID: "db382a72-b559-43b9-ab00-b843f38661a4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.785399 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "db382a72-b559-43b9-ab00-b843f38661a4" (UID: "db382a72-b559-43b9-ab00-b843f38661a4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.790191 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-config" (OuterVolumeSpecName: "config") pod "db382a72-b559-43b9-ab00-b843f38661a4" (UID: "db382a72-b559-43b9-ab00-b843f38661a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.802833 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.802886 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.802903 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.803031 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfjvc\" (UniqueName: \"kubernetes.io/projected/db382a72-b559-43b9-ab00-b843f38661a4-kube-api-access-qfjvc\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.803043 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db382a72-b559-43b9-ab00-b843f38661a4-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:53 crc kubenswrapper[4776]: I1204 10:00:53.988015 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:54 crc kubenswrapper[4776]: I1204 10:00:54.041651 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:00:54 crc kubenswrapper[4776]: E1204 10:00:54.310658 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1d00dde_5442_4613_84c3_6e959bb609f2.slice/crio-conmon-47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:00:54 crc kubenswrapper[4776]: I1204 10:00:54.323684 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" event={"ID":"db382a72-b559-43b9-ab00-b843f38661a4","Type":"ContainerDied","Data":"4eb00471b9bb2c7af9afe3bc9a623409bf4f7df2449058776632d31435985b79"} Dec 04 10:00:54 crc kubenswrapper[4776]: I1204 10:00:54.323776 4776 scope.go:117] "RemoveContainer" containerID="e810eefc0b980ace4079c6069884c75e2ecfc3880d583641bf9995f185d899c8" Dec 04 10:00:54 crc kubenswrapper[4776]: I1204 10:00:54.323927 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-s2n66" Dec 04 10:00:54 crc kubenswrapper[4776]: I1204 10:00:54.328342 4776 generic.go:334] "Generic (PLEG): container finished" podID="b1d00dde-5442-4613-84c3-6e959bb609f2" containerID="47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae" exitCode=0 Dec 04 10:00:54 crc kubenswrapper[4776]: I1204 10:00:54.329001 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1d00dde-5442-4613-84c3-6e959bb609f2","Type":"ContainerDied","Data":"47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae"} Dec 04 10:00:54 crc kubenswrapper[4776]: I1204 10:00:54.370157 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-s2n66"] Dec 04 10:00:54 crc kubenswrapper[4776]: I1204 10:00:54.376551 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-s2n66"] Dec 04 10:00:54 crc kubenswrapper[4776]: I1204 10:00:54.381627 4776 scope.go:117] "RemoveContainer" containerID="cf7e2f6aa22121179ba4435262fa30c4b1be8acd6bcf92582a9aa0c4dc911055" Dec 
04 10:00:54 crc kubenswrapper[4776]: I1204 10:00:54.874272 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.031654 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-combined-ca-bundle\") pod \"b1d00dde-5442-4613-84c3-6e959bb609f2\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.031730 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-config-data\") pod \"b1d00dde-5442-4613-84c3-6e959bb609f2\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.031777 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhpcg\" (UniqueName: \"kubernetes.io/projected/b1d00dde-5442-4613-84c3-6e959bb609f2-kube-api-access-hhpcg\") pod \"b1d00dde-5442-4613-84c3-6e959bb609f2\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.031802 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-config-data-custom\") pod \"b1d00dde-5442-4613-84c3-6e959bb609f2\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.031902 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1d00dde-5442-4613-84c3-6e959bb609f2-etc-machine-id\") pod \"b1d00dde-5442-4613-84c3-6e959bb609f2\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " Dec 04 10:00:55 crc 
kubenswrapper[4776]: I1204 10:00:55.031960 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-scripts\") pod \"b1d00dde-5442-4613-84c3-6e959bb609f2\" (UID: \"b1d00dde-5442-4613-84c3-6e959bb609f2\") " Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.034156 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1d00dde-5442-4613-84c3-6e959bb609f2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b1d00dde-5442-4613-84c3-6e959bb609f2" (UID: "b1d00dde-5442-4613-84c3-6e959bb609f2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.040158 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d00dde-5442-4613-84c3-6e959bb609f2-kube-api-access-hhpcg" (OuterVolumeSpecName: "kube-api-access-hhpcg") pod "b1d00dde-5442-4613-84c3-6e959bb609f2" (UID: "b1d00dde-5442-4613-84c3-6e959bb609f2"). InnerVolumeSpecName "kube-api-access-hhpcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.045788 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b1d00dde-5442-4613-84c3-6e959bb609f2" (UID: "b1d00dde-5442-4613-84c3-6e959bb609f2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.053103 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-scripts" (OuterVolumeSpecName: "scripts") pod "b1d00dde-5442-4613-84c3-6e959bb609f2" (UID: "b1d00dde-5442-4613-84c3-6e959bb609f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.113014 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1d00dde-5442-4613-84c3-6e959bb609f2" (UID: "b1d00dde-5442-4613-84c3-6e959bb609f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.135129 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.135171 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhpcg\" (UniqueName: \"kubernetes.io/projected/b1d00dde-5442-4613-84c3-6e959bb609f2-kube-api-access-hhpcg\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.135184 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.135197 4776 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1d00dde-5442-4613-84c3-6e959bb609f2-etc-machine-id\") on node \"crc\" 
DevicePath \"\"" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.135210 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.162106 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-config-data" (OuterVolumeSpecName: "config-data") pod "b1d00dde-5442-4613-84c3-6e959bb609f2" (UID: "b1d00dde-5442-4613-84c3-6e959bb609f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.236569 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d00dde-5442-4613-84c3-6e959bb609f2-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.339731 4776 generic.go:334] "Generic (PLEG): container finished" podID="b1d00dde-5442-4613-84c3-6e959bb609f2" containerID="cdfee2f11d9e1793ddc988573339ab2c7660e3bd826afada03c6b95f9f9d0964" exitCode=0 Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.339869 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1d00dde-5442-4613-84c3-6e959bb609f2","Type":"ContainerDied","Data":"cdfee2f11d9e1793ddc988573339ab2c7660e3bd826afada03c6b95f9f9d0964"} Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.340451 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1d00dde-5442-4613-84c3-6e959bb609f2","Type":"ContainerDied","Data":"0ee4d6766b2c134b51a33f10478f1667895eb3f0493fcd46223140da838c63ee"} Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.339945 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.340567 4776 scope.go:117] "RemoveContainer" containerID="47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.363675 4776 scope.go:117] "RemoveContainer" containerID="cdfee2f11d9e1793ddc988573339ab2c7660e3bd826afada03c6b95f9f9d0964" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.385394 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.395152 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.413456 4776 scope.go:117] "RemoveContainer" containerID="47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae" Dec 04 10:00:55 crc kubenswrapper[4776]: E1204 10:00:55.416799 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae\": container with ID starting with 47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae not found: ID does not exist" containerID="47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.417091 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae"} err="failed to get container status \"47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae\": rpc error: code = NotFound desc = could not find container \"47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae\": container with ID starting with 47154485bc97ed1087725eaeadc84e40c51d2a4ce0a054662f3a6a92829b93ae not found: ID does not exist" Dec 04 10:00:55 crc 
kubenswrapper[4776]: I1204 10:00:55.417233 4776 scope.go:117] "RemoveContainer" containerID="cdfee2f11d9e1793ddc988573339ab2c7660e3bd826afada03c6b95f9f9d0964" Dec 04 10:00:55 crc kubenswrapper[4776]: E1204 10:00:55.418124 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdfee2f11d9e1793ddc988573339ab2c7660e3bd826afada03c6b95f9f9d0964\": container with ID starting with cdfee2f11d9e1793ddc988573339ab2c7660e3bd826afada03c6b95f9f9d0964 not found: ID does not exist" containerID="cdfee2f11d9e1793ddc988573339ab2c7660e3bd826afada03c6b95f9f9d0964" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.418170 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdfee2f11d9e1793ddc988573339ab2c7660e3bd826afada03c6b95f9f9d0964"} err="failed to get container status \"cdfee2f11d9e1793ddc988573339ab2c7660e3bd826afada03c6b95f9f9d0964\": rpc error: code = NotFound desc = could not find container \"cdfee2f11d9e1793ddc988573339ab2c7660e3bd826afada03c6b95f9f9d0964\": container with ID starting with cdfee2f11d9e1793ddc988573339ab2c7660e3bd826afada03c6b95f9f9d0964 not found: ID does not exist" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.427871 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:00:55 crc kubenswrapper[4776]: E1204 10:00:55.428308 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d00dde-5442-4613-84c3-6e959bb609f2" containerName="probe" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.428328 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d00dde-5442-4613-84c3-6e959bb609f2" containerName="probe" Dec 04 10:00:55 crc kubenswrapper[4776]: E1204 10:00:55.428340 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d00dde-5442-4613-84c3-6e959bb609f2" containerName="cinder-scheduler" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 
10:00:55.428348 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d00dde-5442-4613-84c3-6e959bb609f2" containerName="cinder-scheduler" Dec 04 10:00:55 crc kubenswrapper[4776]: E1204 10:00:55.428377 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db382a72-b559-43b9-ab00-b843f38661a4" containerName="dnsmasq-dns" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.428384 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="db382a72-b559-43b9-ab00-b843f38661a4" containerName="dnsmasq-dns" Dec 04 10:00:55 crc kubenswrapper[4776]: E1204 10:00:55.428398 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db382a72-b559-43b9-ab00-b843f38661a4" containerName="init" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.428405 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="db382a72-b559-43b9-ab00-b843f38661a4" containerName="init" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.428565 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d00dde-5442-4613-84c3-6e959bb609f2" containerName="probe" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.428580 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d00dde-5442-4613-84c3-6e959bb609f2" containerName="cinder-scheduler" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.428601 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="db382a72-b559-43b9-ab00-b843f38661a4" containerName="dnsmasq-dns" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.429612 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.434412 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.450776 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.514551 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d00dde-5442-4613-84c3-6e959bb609f2" path="/var/lib/kubelet/pods/b1d00dde-5442-4613-84c3-6e959bb609f2/volumes" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.515406 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db382a72-b559-43b9-ab00-b843f38661a4" path="/var/lib/kubelet/pods/db382a72-b559-43b9-ab00-b843f38661a4/volumes" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.549121 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhfdm\" (UniqueName: \"kubernetes.io/projected/732251b5-2be6-4542-89b4-e20649ec27d0-kube-api-access-vhfdm\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.549202 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732251b5-2be6-4542-89b4-e20649ec27d0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.549254 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732251b5-2be6-4542-89b4-e20649ec27d0-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.549281 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732251b5-2be6-4542-89b4-e20649ec27d0-scripts\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.549324 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/732251b5-2be6-4542-89b4-e20649ec27d0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.549345 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/732251b5-2be6-4542-89b4-e20649ec27d0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.651064 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732251b5-2be6-4542-89b4-e20649ec27d0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.651164 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732251b5-2be6-4542-89b4-e20649ec27d0-config-data\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc 
kubenswrapper[4776]: I1204 10:00:55.651194 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732251b5-2be6-4542-89b4-e20649ec27d0-scripts\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.651244 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/732251b5-2be6-4542-89b4-e20649ec27d0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.651263 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/732251b5-2be6-4542-89b4-e20649ec27d0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.651358 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhfdm\" (UniqueName: \"kubernetes.io/projected/732251b5-2be6-4542-89b4-e20649ec27d0-kube-api-access-vhfdm\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.653037 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/732251b5-2be6-4542-89b4-e20649ec27d0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.657157 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/732251b5-2be6-4542-89b4-e20649ec27d0-scripts\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.657788 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732251b5-2be6-4542-89b4-e20649ec27d0-config-data\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.664274 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/732251b5-2be6-4542-89b4-e20649ec27d0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.679108 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732251b5-2be6-4542-89b4-e20649ec27d0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.682562 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhfdm\" (UniqueName: \"kubernetes.io/projected/732251b5-2be6-4542-89b4-e20649ec27d0-kube-api-access-vhfdm\") pod \"cinder-scheduler-0\" (UID: \"732251b5-2be6-4542-89b4-e20649ec27d0\") " pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.780113 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 10:00:55 crc kubenswrapper[4776]: I1204 10:00:55.858396 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 04 10:00:56 crc kubenswrapper[4776]: I1204 10:00:56.356137 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 10:00:57 crc kubenswrapper[4776]: I1204 10:00:57.173986 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-86f58b95b9-j2njt" Dec 04 10:00:57 crc kubenswrapper[4776]: I1204 10:00:57.394165 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"732251b5-2be6-4542-89b4-e20649ec27d0","Type":"ContainerStarted","Data":"f85cdaceea55d741e3f033cb5080bba6ef767c5589bc6b6be9d41b14e628a328"} Dec 04 10:00:57 crc kubenswrapper[4776]: I1204 10:00:57.394215 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"732251b5-2be6-4542-89b4-e20649ec27d0","Type":"ContainerStarted","Data":"47871d4015358cfe354520c95123326c89f33eef5160cc8f2ae3f6d9fd3d9758"} Dec 04 10:00:58 crc kubenswrapper[4776]: I1204 10:00:58.405384 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"732251b5-2be6-4542-89b4-e20649ec27d0","Type":"ContainerStarted","Data":"cc785f812fb89a95a12e42bb5bcbf53e850f84b9ef9c9fec602f1548e8eb4167"} Dec 04 10:00:58 crc kubenswrapper[4776]: I1204 10:00:58.442955 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.442912137 podStartE2EDuration="3.442912137s" podCreationTimestamp="2025-12-04 10:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:00:58.430273099 +0000 UTC m=+1303.296753466" watchObservedRunningTime="2025-12-04 
10:00:58.442912137 +0000 UTC m=+1303.309392524" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.138249 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29414041-vtpvx"] Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.140030 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.147342 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414041-vtpvx"] Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.188868 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.190226 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.193622 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.193943 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.194113 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-4gn6l" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.219807 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.252608 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-combined-ca-bundle\") pod \"keystone-cron-29414041-vtpvx\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc 
kubenswrapper[4776]: I1204 10:01:00.252699 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-fernet-keys\") pod \"keystone-cron-29414041-vtpvx\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.252802 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4jfr\" (UniqueName: \"kubernetes.io/projected/9d261d84-4a7d-4b97-bffa-be0cae0c8102-kube-api-access-n4jfr\") pod \"keystone-cron-29414041-vtpvx\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.252878 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-config-data\") pod \"keystone-cron-29414041-vtpvx\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.355142 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d93526f-f97b-4a2a-98b4-4b880a99cbd7-openstack-config\") pod \"openstackclient\" (UID: \"5d93526f-f97b-4a2a-98b4-4b880a99cbd7\") " pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.355376 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-combined-ca-bundle\") pod \"keystone-cron-29414041-vtpvx\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc 
kubenswrapper[4776]: I1204 10:01:00.355590 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d93526f-f97b-4a2a-98b4-4b880a99cbd7-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d93526f-f97b-4a2a-98b4-4b880a99cbd7\") " pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.355645 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-fernet-keys\") pod \"keystone-cron-29414041-vtpvx\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.355862 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4jfr\" (UniqueName: \"kubernetes.io/projected/9d261d84-4a7d-4b97-bffa-be0cae0c8102-kube-api-access-n4jfr\") pod \"keystone-cron-29414041-vtpvx\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.356023 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d93526f-f97b-4a2a-98b4-4b880a99cbd7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d93526f-f97b-4a2a-98b4-4b880a99cbd7\") " pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.356187 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-config-data\") pod \"keystone-cron-29414041-vtpvx\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.356334 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgbkj\" (UniqueName: \"kubernetes.io/projected/5d93526f-f97b-4a2a-98b4-4b880a99cbd7-kube-api-access-bgbkj\") pod \"openstackclient\" (UID: \"5d93526f-f97b-4a2a-98b4-4b880a99cbd7\") " pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.374265 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-combined-ca-bundle\") pod \"keystone-cron-29414041-vtpvx\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.374300 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-fernet-keys\") pod \"keystone-cron-29414041-vtpvx\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.374431 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-config-data\") pod \"keystone-cron-29414041-vtpvx\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.384606 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4jfr\" (UniqueName: \"kubernetes.io/projected/9d261d84-4a7d-4b97-bffa-be0cae0c8102-kube-api-access-n4jfr\") pod \"keystone-cron-29414041-vtpvx\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.458825 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d93526f-f97b-4a2a-98b4-4b880a99cbd7-openstack-config\") pod \"openstackclient\" (UID: \"5d93526f-f97b-4a2a-98b4-4b880a99cbd7\") " pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.459009 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d93526f-f97b-4a2a-98b4-4b880a99cbd7-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d93526f-f97b-4a2a-98b4-4b880a99cbd7\") " pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.459056 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d93526f-f97b-4a2a-98b4-4b880a99cbd7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d93526f-f97b-4a2a-98b4-4b880a99cbd7\") " pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.459122 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgbkj\" (UniqueName: \"kubernetes.io/projected/5d93526f-f97b-4a2a-98b4-4b880a99cbd7-kube-api-access-bgbkj\") pod \"openstackclient\" (UID: \"5d93526f-f97b-4a2a-98b4-4b880a99cbd7\") " pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.459836 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d93526f-f97b-4a2a-98b4-4b880a99cbd7-openstack-config\") pod \"openstackclient\" (UID: \"5d93526f-f97b-4a2a-98b4-4b880a99cbd7\") " pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.463529 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d93526f-f97b-4a2a-98b4-4b880a99cbd7-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"5d93526f-f97b-4a2a-98b4-4b880a99cbd7\") " pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.464099 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d93526f-f97b-4a2a-98b4-4b880a99cbd7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d93526f-f97b-4a2a-98b4-4b880a99cbd7\") " pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.467391 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.484363 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgbkj\" (UniqueName: \"kubernetes.io/projected/5d93526f-f97b-4a2a-98b4-4b880a99cbd7-kube-api-access-bgbkj\") pod \"openstackclient\" (UID: \"5d93526f-f97b-4a2a-98b4-4b880a99cbd7\") " pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.515668 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.761690 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:01:00 crc kubenswrapper[4776]: I1204 10:01:00.781810 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 10:01:01 crc kubenswrapper[4776]: I1204 10:01:01.060075 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414041-vtpvx"] Dec 04 10:01:01 crc kubenswrapper[4776]: I1204 10:01:01.079660 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-555f995688-x45jv" Dec 04 10:01:01 crc kubenswrapper[4776]: I1204 10:01:01.155515 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5889fdcbc8-cklvv"] Dec 04 10:01:01 crc kubenswrapper[4776]: I1204 10:01:01.155794 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5889fdcbc8-cklvv" podUID="058e23d8-9b9b-4488-9f4b-2060748f6966" containerName="barbican-api-log" containerID="cri-o://c902dcd5deab89cad42cbc097e54c72c6d07a37bef0f5af0346b3d4b813066b4" gracePeriod=30 Dec 04 10:01:01 crc kubenswrapper[4776]: I1204 10:01:01.156113 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5889fdcbc8-cklvv" podUID="058e23d8-9b9b-4488-9f4b-2060748f6966" containerName="barbican-api" containerID="cri-o://b4bbd6145fdcbc6d865f119fa4bf6ab72cefd0725027dd7dbef573c5b15b7116" gracePeriod=30 Dec 04 10:01:01 crc kubenswrapper[4776]: I1204 10:01:01.182983 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 10:01:01 crc kubenswrapper[4776]: I1204 10:01:01.440348 4776 generic.go:334] "Generic (PLEG): container finished" podID="058e23d8-9b9b-4488-9f4b-2060748f6966" 
containerID="c902dcd5deab89cad42cbc097e54c72c6d07a37bef0f5af0346b3d4b813066b4" exitCode=143 Dec 04 10:01:01 crc kubenswrapper[4776]: I1204 10:01:01.440450 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5889fdcbc8-cklvv" event={"ID":"058e23d8-9b9b-4488-9f4b-2060748f6966","Type":"ContainerDied","Data":"c902dcd5deab89cad42cbc097e54c72c6d07a37bef0f5af0346b3d4b813066b4"} Dec 04 10:01:01 crc kubenswrapper[4776]: I1204 10:01:01.443063 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414041-vtpvx" event={"ID":"9d261d84-4a7d-4b97-bffa-be0cae0c8102","Type":"ContainerStarted","Data":"3bf689d24a365adc6d27975a3859aeed30d40d6e72f56df4bc04ed2722cc33e5"} Dec 04 10:01:01 crc kubenswrapper[4776]: I1204 10:01:01.443112 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414041-vtpvx" event={"ID":"9d261d84-4a7d-4b97-bffa-be0cae0c8102","Type":"ContainerStarted","Data":"2b353ab9750c629b3b9f2ecbaeaacb45ce6e7fe4b8a1bec9fb7643e231a77dfa"} Dec 04 10:01:01 crc kubenswrapper[4776]: I1204 10:01:01.446435 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5d93526f-f97b-4a2a-98b4-4b880a99cbd7","Type":"ContainerStarted","Data":"858a1ed050388dafb83a2429230d105f423e89cf885af8c5a6109868fe9ee320"} Dec 04 10:01:01 crc kubenswrapper[4776]: I1204 10:01:01.467662 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29414041-vtpvx" podStartSLOduration=1.467640898 podStartE2EDuration="1.467640898s" podCreationTimestamp="2025-12-04 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:01:01.457993784 +0000 UTC m=+1306.324474171" watchObservedRunningTime="2025-12-04 10:01:01.467640898 +0000 UTC m=+1306.334121275" Dec 04 10:01:03 crc kubenswrapper[4776]: I1204 10:01:03.959721 4776 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:01:04 crc kubenswrapper[4776]: I1204 10:01:04.310287 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5889fdcbc8-cklvv" podUID="058e23d8-9b9b-4488-9f4b-2060748f6966" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.148:9311/healthcheck\": read tcp 10.217.0.2:44840->10.217.0.148:9311: read: connection reset by peer" Dec 04 10:01:04 crc kubenswrapper[4776]: I1204 10:01:04.310335 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5889fdcbc8-cklvv" podUID="058e23d8-9b9b-4488-9f4b-2060748f6966" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.148:9311/healthcheck\": read tcp 10.217.0.2:44828->10.217.0.148:9311: read: connection reset by peer" Dec 04 10:01:04 crc kubenswrapper[4776]: I1204 10:01:04.549672 4776 generic.go:334] "Generic (PLEG): container finished" podID="9d261d84-4a7d-4b97-bffa-be0cae0c8102" containerID="3bf689d24a365adc6d27975a3859aeed30d40d6e72f56df4bc04ed2722cc33e5" exitCode=0 Dec 04 10:01:04 crc kubenswrapper[4776]: I1204 10:01:04.549725 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414041-vtpvx" event={"ID":"9d261d84-4a7d-4b97-bffa-be0cae0c8102","Type":"ContainerDied","Data":"3bf689d24a365adc6d27975a3859aeed30d40d6e72f56df4bc04ed2722cc33e5"} Dec 04 10:01:04 crc kubenswrapper[4776]: I1204 10:01:04.973405 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.143894 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-config-data\") pod \"058e23d8-9b9b-4488-9f4b-2060748f6966\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.144045 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-combined-ca-bundle\") pod \"058e23d8-9b9b-4488-9f4b-2060748f6966\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.144136 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpnlh\" (UniqueName: \"kubernetes.io/projected/058e23d8-9b9b-4488-9f4b-2060748f6966-kube-api-access-vpnlh\") pod \"058e23d8-9b9b-4488-9f4b-2060748f6966\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.144513 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058e23d8-9b9b-4488-9f4b-2060748f6966-logs\") pod \"058e23d8-9b9b-4488-9f4b-2060748f6966\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.144655 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-config-data-custom\") pod \"058e23d8-9b9b-4488-9f4b-2060748f6966\" (UID: \"058e23d8-9b9b-4488-9f4b-2060748f6966\") " Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.151603 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/058e23d8-9b9b-4488-9f4b-2060748f6966-logs" (OuterVolumeSpecName: "logs") pod "058e23d8-9b9b-4488-9f4b-2060748f6966" (UID: "058e23d8-9b9b-4488-9f4b-2060748f6966"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.152050 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "058e23d8-9b9b-4488-9f4b-2060748f6966" (UID: "058e23d8-9b9b-4488-9f4b-2060748f6966"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.154901 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058e23d8-9b9b-4488-9f4b-2060748f6966-kube-api-access-vpnlh" (OuterVolumeSpecName: "kube-api-access-vpnlh") pod "058e23d8-9b9b-4488-9f4b-2060748f6966" (UID: "058e23d8-9b9b-4488-9f4b-2060748f6966"). InnerVolumeSpecName "kube-api-access-vpnlh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.430258 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpnlh\" (UniqueName: \"kubernetes.io/projected/058e23d8-9b9b-4488-9f4b-2060748f6966-kube-api-access-vpnlh\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.430313 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058e23d8-9b9b-4488-9f4b-2060748f6966-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.430326 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.437197 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "058e23d8-9b9b-4488-9f4b-2060748f6966" (UID: "058e23d8-9b9b-4488-9f4b-2060748f6966"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.564770 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-config-data" (OuterVolumeSpecName: "config-data") pod "058e23d8-9b9b-4488-9f4b-2060748f6966" (UID: "058e23d8-9b9b-4488-9f4b-2060748f6966"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.568490 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.568523 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058e23d8-9b9b-4488-9f4b-2060748f6966-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.620072 4776 generic.go:334] "Generic (PLEG): container finished" podID="058e23d8-9b9b-4488-9f4b-2060748f6966" containerID="b4bbd6145fdcbc6d865f119fa4bf6ab72cefd0725027dd7dbef573c5b15b7116" exitCode=0 Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.620396 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5889fdcbc8-cklvv" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.622989 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5889fdcbc8-cklvv" event={"ID":"058e23d8-9b9b-4488-9f4b-2060748f6966","Type":"ContainerDied","Data":"b4bbd6145fdcbc6d865f119fa4bf6ab72cefd0725027dd7dbef573c5b15b7116"} Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.623039 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5889fdcbc8-cklvv" event={"ID":"058e23d8-9b9b-4488-9f4b-2060748f6966","Type":"ContainerDied","Data":"af95ac553d74c5466a444280f97a7b367e061fb95380dd3db4e9972e8ed5b3de"} Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.623061 4776 scope.go:117] "RemoveContainer" containerID="b4bbd6145fdcbc6d865f119fa4bf6ab72cefd0725027dd7dbef573c5b15b7116" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.675432 4776 scope.go:117] "RemoveContainer" 
containerID="c902dcd5deab89cad42cbc097e54c72c6d07a37bef0f5af0346b3d4b813066b4" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.690482 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5889fdcbc8-cklvv"] Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.699569 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5889fdcbc8-cklvv"] Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.743112 4776 scope.go:117] "RemoveContainer" containerID="b4bbd6145fdcbc6d865f119fa4bf6ab72cefd0725027dd7dbef573c5b15b7116" Dec 04 10:01:05 crc kubenswrapper[4776]: E1204 10:01:05.744407 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4bbd6145fdcbc6d865f119fa4bf6ab72cefd0725027dd7dbef573c5b15b7116\": container with ID starting with b4bbd6145fdcbc6d865f119fa4bf6ab72cefd0725027dd7dbef573c5b15b7116 not found: ID does not exist" containerID="b4bbd6145fdcbc6d865f119fa4bf6ab72cefd0725027dd7dbef573c5b15b7116" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.744457 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4bbd6145fdcbc6d865f119fa4bf6ab72cefd0725027dd7dbef573c5b15b7116"} err="failed to get container status \"b4bbd6145fdcbc6d865f119fa4bf6ab72cefd0725027dd7dbef573c5b15b7116\": rpc error: code = NotFound desc = could not find container \"b4bbd6145fdcbc6d865f119fa4bf6ab72cefd0725027dd7dbef573c5b15b7116\": container with ID starting with b4bbd6145fdcbc6d865f119fa4bf6ab72cefd0725027dd7dbef573c5b15b7116 not found: ID does not exist" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.744488 4776 scope.go:117] "RemoveContainer" containerID="c902dcd5deab89cad42cbc097e54c72c6d07a37bef0f5af0346b3d4b813066b4" Dec 04 10:01:05 crc kubenswrapper[4776]: E1204 10:01:05.744819 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"c902dcd5deab89cad42cbc097e54c72c6d07a37bef0f5af0346b3d4b813066b4\": container with ID starting with c902dcd5deab89cad42cbc097e54c72c6d07a37bef0f5af0346b3d4b813066b4 not found: ID does not exist" containerID="c902dcd5deab89cad42cbc097e54c72c6d07a37bef0f5af0346b3d4b813066b4" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.744865 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c902dcd5deab89cad42cbc097e54c72c6d07a37bef0f5af0346b3d4b813066b4"} err="failed to get container status \"c902dcd5deab89cad42cbc097e54c72c6d07a37bef0f5af0346b3d4b813066b4\": rpc error: code = NotFound desc = could not find container \"c902dcd5deab89cad42cbc097e54c72c6d07a37bef0f5af0346b3d4b813066b4\": container with ID starting with c902dcd5deab89cad42cbc097e54c72c6d07a37bef0f5af0346b3d4b813066b4 not found: ID does not exist" Dec 04 10:01:05 crc kubenswrapper[4776]: I1204 10:01:05.977963 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5779dfffd5-drdt5" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.061345 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-675d968b4d-rm7mc"] Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.061558 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-675d968b4d-rm7mc" podUID="f60afb42-ba20-437f-bbd8-f2c50de3e2d1" containerName="neutron-api" containerID="cri-o://84233a5c27c5c03371efd40bf7d908ff17ce80861169afb025e56cd41ee3b25b" gracePeriod=30 Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.061663 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-675d968b4d-rm7mc" podUID="f60afb42-ba20-437f-bbd8-f2c50de3e2d1" containerName="neutron-httpd" containerID="cri-o://c66bd142f0b1256bba10d9c5a058d40e0ea822795f39936d5a6646d14595770e" gracePeriod=30 Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.113159 
4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.181841 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-fernet-keys\") pod \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.181878 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4jfr\" (UniqueName: \"kubernetes.io/projected/9d261d84-4a7d-4b97-bffa-be0cae0c8102-kube-api-access-n4jfr\") pod \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.181941 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-combined-ca-bundle\") pod \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.182003 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-config-data\") pod \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\" (UID: \"9d261d84-4a7d-4b97-bffa-be0cae0c8102\") " Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.188340 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d261d84-4a7d-4b97-bffa-be0cae0c8102-kube-api-access-n4jfr" (OuterVolumeSpecName: "kube-api-access-n4jfr") pod "9d261d84-4a7d-4b97-bffa-be0cae0c8102" (UID: "9d261d84-4a7d-4b97-bffa-be0cae0c8102"). InnerVolumeSpecName "kube-api-access-n4jfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.196080 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9d261d84-4a7d-4b97-bffa-be0cae0c8102" (UID: "9d261d84-4a7d-4b97-bffa-be0cae0c8102"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.240234 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.244102 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d261d84-4a7d-4b97-bffa-be0cae0c8102" (UID: "9d261d84-4a7d-4b97-bffa-be0cae0c8102"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.285297 4776 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.285341 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4jfr\" (UniqueName: \"kubernetes.io/projected/9d261d84-4a7d-4b97-bffa-be0cae0c8102-kube-api-access-n4jfr\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.285354 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.304200 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-config-data" (OuterVolumeSpecName: "config-data") pod "9d261d84-4a7d-4b97-bffa-be0cae0c8102" (UID: "9d261d84-4a7d-4b97-bffa-be0cae0c8102"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.391452 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d261d84-4a7d-4b97-bffa-be0cae0c8102-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.632364 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414041-vtpvx" event={"ID":"9d261d84-4a7d-4b97-bffa-be0cae0c8102","Type":"ContainerDied","Data":"2b353ab9750c629b3b9f2ecbaeaacb45ce6e7fe4b8a1bec9fb7643e231a77dfa"} Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.632659 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b353ab9750c629b3b9f2ecbaeaacb45ce6e7fe4b8a1bec9fb7643e231a77dfa" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.632612 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414041-vtpvx" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.636350 4776 generic.go:334] "Generic (PLEG): container finished" podID="f60afb42-ba20-437f-bbd8-f2c50de3e2d1" containerID="c66bd142f0b1256bba10d9c5a058d40e0ea822795f39936d5a6646d14595770e" exitCode=0 Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.636413 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-675d968b4d-rm7mc" event={"ID":"f60afb42-ba20-437f-bbd8-f2c50de3e2d1","Type":"ContainerDied","Data":"c66bd142f0b1256bba10d9c5a058d40e0ea822795f39936d5a6646d14595770e"} Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.840908 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:01:06 crc kubenswrapper[4776]: I1204 10:01:06.846745 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d564c574b-x8jlb" Dec 04 10:01:07 crc kubenswrapper[4776]: 
I1204 10:01:07.476949 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058e23d8-9b9b-4488-9f4b-2060748f6966" path="/var/lib/kubelet/pods/058e23d8-9b9b-4488-9f4b-2060748f6966/volumes" Dec 04 10:01:08 crc kubenswrapper[4776]: I1204 10:01:08.353588 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 10:01:09 crc kubenswrapper[4776]: I1204 10:01:09.705566 4776 generic.go:334] "Generic (PLEG): container finished" podID="f60afb42-ba20-437f-bbd8-f2c50de3e2d1" containerID="84233a5c27c5c03371efd40bf7d908ff17ce80861169afb025e56cd41ee3b25b" exitCode=0 Dec 04 10:01:09 crc kubenswrapper[4776]: I1204 10:01:09.706010 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-675d968b4d-rm7mc" event={"ID":"f60afb42-ba20-437f-bbd8-f2c50de3e2d1","Type":"ContainerDied","Data":"84233a5c27c5c03371efd40bf7d908ff17ce80861169afb025e56cd41ee3b25b"} Dec 04 10:01:10 crc kubenswrapper[4776]: I1204 10:01:10.593851 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:01:10 crc kubenswrapper[4776]: I1204 10:01:10.594199 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="ceilometer-central-agent" containerID="cri-o://50c7c84857d456728e1600cd096e04ba168067e054852148ae76c2e5f23dea3e" gracePeriod=30 Dec 04 10:01:10 crc kubenswrapper[4776]: I1204 10:01:10.594289 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="proxy-httpd" containerID="cri-o://32f69bbdee2fdb20e5603c82c042d018b1f39eee0e7a111f6abfcefeb564b013" gracePeriod=30 Dec 04 10:01:10 crc kubenswrapper[4776]: I1204 10:01:10.594341 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" 
containerName="ceilometer-notification-agent" containerID="cri-o://27c8ed4d7484083473dc69f99f4342884d26a6c480b5ed536e99164920c5d362" gracePeriod=30 Dec 04 10:01:10 crc kubenswrapper[4776]: I1204 10:01:10.594440 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="sg-core" containerID="cri-o://528db00aae47b8937035bb07d6facb53b09b42e3bf0784e3fe161ce0fb3c9ae6" gracePeriod=30 Dec 04 10:01:12 crc kubenswrapper[4776]: I1204 10:01:12.046294 4776 generic.go:334] "Generic (PLEG): container finished" podID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerID="32f69bbdee2fdb20e5603c82c042d018b1f39eee0e7a111f6abfcefeb564b013" exitCode=0 Dec 04 10:01:12 crc kubenswrapper[4776]: I1204 10:01:12.046624 4776 generic.go:334] "Generic (PLEG): container finished" podID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerID="528db00aae47b8937035bb07d6facb53b09b42e3bf0784e3fe161ce0fb3c9ae6" exitCode=2 Dec 04 10:01:12 crc kubenswrapper[4776]: I1204 10:01:12.046633 4776 generic.go:334] "Generic (PLEG): container finished" podID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerID="50c7c84857d456728e1600cd096e04ba168067e054852148ae76c2e5f23dea3e" exitCode=0 Dec 04 10:01:12 crc kubenswrapper[4776]: I1204 10:01:12.046417 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a","Type":"ContainerDied","Data":"32f69bbdee2fdb20e5603c82c042d018b1f39eee0e7a111f6abfcefeb564b013"} Dec 04 10:01:12 crc kubenswrapper[4776]: I1204 10:01:12.046671 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a","Type":"ContainerDied","Data":"528db00aae47b8937035bb07d6facb53b09b42e3bf0784e3fe161ce0fb3c9ae6"} Dec 04 10:01:12 crc kubenswrapper[4776]: I1204 10:01:12.046686 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a","Type":"ContainerDied","Data":"50c7c84857d456728e1600cd096e04ba168067e054852148ae76c2e5f23dea3e"} Dec 04 10:01:15 crc kubenswrapper[4776]: E1204 10:01:15.913375 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Dec 04 10:01:15 crc kubenswrapper[4776]: E1204 10:01:15.913879 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b4h5d4hf8hfbh665h68fh8h579h5cdh54h57fh67dh58dh565h5f5h8fh9h656h66chc4h5bfh65ch58fh5cfh584hf5h597h85h54h669h65ch55fq,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Nam
e:kube-api-access-bgbkj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(5d93526f-f97b-4a2a-98b4-4b880a99cbd7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:01:15 crc kubenswrapper[4776]: E1204 10:01:15.915241 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="5d93526f-f97b-4a2a-98b4-4b880a99cbd7" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.125385 4776 generic.go:334] "Generic (PLEG): container finished" podID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerID="27c8ed4d7484083473dc69f99f4342884d26a6c480b5ed536e99164920c5d362" exitCode=0 Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.126280 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a","Type":"ContainerDied","Data":"27c8ed4d7484083473dc69f99f4342884d26a6c480b5ed536e99164920c5d362"} Dec 04 10:01:16 crc kubenswrapper[4776]: E1204 10:01:16.128161 4776 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="5d93526f-f97b-4a2a-98b4-4b880a99cbd7" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.380521 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.394834 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-combined-ca-bundle\") pod \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.394959 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-scripts\") pod \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.396856 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-log-httpd\") pod \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.396957 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-config-data\") pod \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.396992 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-8mm9z\" (UniqueName: \"kubernetes.io/projected/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-kube-api-access-8mm9z\") pod \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.397045 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-sg-core-conf-yaml\") pod \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.397110 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-run-httpd\") pod \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\" (UID: \"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a\") " Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.399339 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" (UID: "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.399604 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" (UID: "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.407831 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-scripts" (OuterVolumeSpecName: "scripts") pod "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" (UID: "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.657961 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.657996 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.658008 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.658976 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-kube-api-access-8mm9z" (OuterVolumeSpecName: "kube-api-access-8mm9z") pod "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" (UID: "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a"). InnerVolumeSpecName "kube-api-access-8mm9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.679348 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" (UID: "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.701648 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.707085 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" (UID: "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.759691 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-ovndb-tls-certs\") pod \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.759820 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-combined-ca-bundle\") pod \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.759868 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsxzz\" (UniqueName: \"kubernetes.io/projected/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-kube-api-access-qsxzz\") pod \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.760327 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mm9z\" (UniqueName: \"kubernetes.io/projected/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-kube-api-access-8mm9z\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.760350 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.760363 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:16 crc kubenswrapper[4776]: 
I1204 10:01:16.764682 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-kube-api-access-qsxzz" (OuterVolumeSpecName: "kube-api-access-qsxzz") pod "f60afb42-ba20-437f-bbd8-f2c50de3e2d1" (UID: "f60afb42-ba20-437f-bbd8-f2c50de3e2d1"). InnerVolumeSpecName "kube-api-access-qsxzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.784559 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-config-data" (OuterVolumeSpecName: "config-data") pod "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" (UID: "6d1f4eb3-1e4e-424a-95cc-6da3be46a54a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.813941 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f60afb42-ba20-437f-bbd8-f2c50de3e2d1" (UID: "f60afb42-ba20-437f-bbd8-f2c50de3e2d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.841999 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f60afb42-ba20-437f-bbd8-f2c50de3e2d1" (UID: "f60afb42-ba20-437f-bbd8-f2c50de3e2d1"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.862707 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-httpd-config\") pod \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.862767 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-config\") pod \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\" (UID: \"f60afb42-ba20-437f-bbd8-f2c50de3e2d1\") " Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.863438 4776 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.863461 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.863472 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsxzz\" (UniqueName: \"kubernetes.io/projected/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-kube-api-access-qsxzz\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.863484 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.866533 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f60afb42-ba20-437f-bbd8-f2c50de3e2d1" (UID: "f60afb42-ba20-437f-bbd8-f2c50de3e2d1"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.920149 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-config" (OuterVolumeSpecName: "config") pod "f60afb42-ba20-437f-bbd8-f2c50de3e2d1" (UID: "f60afb42-ba20-437f-bbd8-f2c50de3e2d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.965736 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:16 crc kubenswrapper[4776]: I1204 10:01:16.965791 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f60afb42-ba20-437f-bbd8-f2c50de3e2d1-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.186114 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-675d968b4d-rm7mc" event={"ID":"f60afb42-ba20-437f-bbd8-f2c50de3e2d1","Type":"ContainerDied","Data":"be58cee1d66c0577a9c1dd0bd7587de2ed3149aa78b1f36dc44c97e58cdd6f7d"} Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.186431 4776 scope.go:117] "RemoveContainer" containerID="c66bd142f0b1256bba10d9c5a058d40e0ea822795f39936d5a6646d14595770e" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.186555 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-675d968b4d-rm7mc" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.231317 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d1f4eb3-1e4e-424a-95cc-6da3be46a54a","Type":"ContainerDied","Data":"8ce808209a6a415fee28f7549f241c6b343b42d31c18905219c2307e69735f4f"} Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.231728 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.252718 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-675d968b4d-rm7mc"] Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.287646 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-675d968b4d-rm7mc"] Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.289177 4776 scope.go:117] "RemoveContainer" containerID="84233a5c27c5c03371efd40bf7d908ff17ce80861169afb025e56cd41ee3b25b" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.343984 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.387987 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.420976 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.426095 4776 scope.go:117] "RemoveContainer" containerID="32f69bbdee2fdb20e5603c82c042d018b1f39eee0e7a111f6abfcefeb564b013" Dec 04 10:01:17 crc kubenswrapper[4776]: E1204 10:01:17.432978 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058e23d8-9b9b-4488-9f4b-2060748f6966" containerName="barbican-api-log" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433025 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="058e23d8-9b9b-4488-9f4b-2060748f6966" containerName="barbican-api-log" Dec 04 10:01:17 crc kubenswrapper[4776]: E1204 10:01:17.433041 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f60afb42-ba20-437f-bbd8-f2c50de3e2d1" containerName="neutron-httpd" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433047 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f60afb42-ba20-437f-bbd8-f2c50de3e2d1" containerName="neutron-httpd" Dec 04 10:01:17 crc kubenswrapper[4776]: E1204 10:01:17.433059 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="ceilometer-notification-agent" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433066 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="ceilometer-notification-agent" Dec 04 10:01:17 crc kubenswrapper[4776]: E1204 10:01:17.433082 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d261d84-4a7d-4b97-bffa-be0cae0c8102" containerName="keystone-cron" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433088 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d261d84-4a7d-4b97-bffa-be0cae0c8102" containerName="keystone-cron" Dec 04 10:01:17 crc kubenswrapper[4776]: E1204 10:01:17.433105 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058e23d8-9b9b-4488-9f4b-2060748f6966" containerName="barbican-api" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433112 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="058e23d8-9b9b-4488-9f4b-2060748f6966" containerName="barbican-api" Dec 04 10:01:17 crc kubenswrapper[4776]: E1204 10:01:17.433126 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f60afb42-ba20-437f-bbd8-f2c50de3e2d1" containerName="neutron-api" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433131 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f60afb42-ba20-437f-bbd8-f2c50de3e2d1" containerName="neutron-api" Dec 04 10:01:17 crc kubenswrapper[4776]: E1204 10:01:17.433154 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="sg-core" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433159 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="sg-core" Dec 04 10:01:17 crc kubenswrapper[4776]: E1204 10:01:17.433170 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="ceilometer-central-agent" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433177 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="ceilometer-central-agent" Dec 04 10:01:17 crc kubenswrapper[4776]: E1204 10:01:17.433197 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="proxy-httpd" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433203 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="proxy-httpd" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433490 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="ceilometer-notification-agent" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433500 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="ceilometer-central-agent" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433506 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="sg-core" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433517 4776 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f60afb42-ba20-437f-bbd8-f2c50de3e2d1" containerName="neutron-api" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433529 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f60afb42-ba20-437f-bbd8-f2c50de3e2d1" containerName="neutron-httpd" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433539 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" containerName="proxy-httpd" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433546 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="058e23d8-9b9b-4488-9f4b-2060748f6966" containerName="barbican-api" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433557 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="058e23d8-9b9b-4488-9f4b-2060748f6966" containerName="barbican-api-log" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.433566 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d261d84-4a7d-4b97-bffa-be0cae0c8102" containerName="keystone-cron" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.435239 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.443584 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.443881 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.478617 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d1f4eb3-1e4e-424a-95cc-6da3be46a54a" path="/var/lib/kubelet/pods/6d1f4eb3-1e4e-424a-95cc-6da3be46a54a/volumes" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.480388 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f60afb42-ba20-437f-bbd8-f2c50de3e2d1" path="/var/lib/kubelet/pods/f60afb42-ba20-437f-bbd8-f2c50de3e2d1/volumes" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.481122 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.495276 4776 scope.go:117] "RemoveContainer" containerID="528db00aae47b8937035bb07d6facb53b09b42e3bf0784e3fe161ce0fb3c9ae6" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.533771 4776 scope.go:117] "RemoveContainer" containerID="27c8ed4d7484083473dc69f99f4342884d26a6c480b5ed536e99164920c5d362" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.560275 4776 scope.go:117] "RemoveContainer" containerID="50c7c84857d456728e1600cd096e04ba168067e054852148ae76c2e5f23dea3e" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.560610 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-scripts\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.560755 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrpsb\" (UniqueName: \"kubernetes.io/projected/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-kube-api-access-lrpsb\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.560803 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-log-httpd\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.560835 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.560862 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-run-httpd\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.560992 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.561074 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-config-data\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.662463 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-scripts\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.662588 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrpsb\" (UniqueName: \"kubernetes.io/projected/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-kube-api-access-lrpsb\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.662621 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-log-httpd\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.662647 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.662673 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-run-httpd\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 
10:01:17.662699 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.662739 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-config-data\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.663349 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-log-httpd\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.663449 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-run-httpd\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.667022 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.667680 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-config-data\") pod \"ceilometer-0\" (UID: 
\"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.668379 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.670104 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-scripts\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:17 crc kubenswrapper[4776]: I1204 10:01:17.681592 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrpsb\" (UniqueName: \"kubernetes.io/projected/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-kube-api-access-lrpsb\") pod \"ceilometer-0\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " pod="openstack/ceilometer-0" Dec 04 10:01:18 crc kubenswrapper[4776]: I1204 10:01:18.252333 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:01:18 crc kubenswrapper[4776]: I1204 10:01:18.263167 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="627d090b-c706-469f-9370-f06c1a9d7e89" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.151:8776/healthcheck\": dial tcp 10.217.0.151:8776: connect: connection refused" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.144893 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.294029 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.319182 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73bc5f13-b110-4ad4-a830-ed9d5e6d9800","Type":"ContainerStarted","Data":"8e25f8afec68d940f4f7818609036d6d085a27f1dab934a57b53e42749d41e33"} Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.322766 4776 generic.go:334] "Generic (PLEG): container finished" podID="627d090b-c706-469f-9370-f06c1a9d7e89" containerID="3cc1b18e216b741b511737efe8c0aa37f10f020bd37732e2d3d41c6610c4392a" exitCode=137 Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.322818 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"627d090b-c706-469f-9370-f06c1a9d7e89","Type":"ContainerDied","Data":"3cc1b18e216b741b511737efe8c0aa37f10f020bd37732e2d3d41c6610c4392a"} Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.322838 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.322861 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"627d090b-c706-469f-9370-f06c1a9d7e89","Type":"ContainerDied","Data":"0b7c250048dfd4a24dfb3e670db7323ad31583df16409e4ec7dd9b74fddfe271"} Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.322881 4776 scope.go:117] "RemoveContainer" containerID="3cc1b18e216b741b511737efe8c0aa37f10f020bd37732e2d3d41c6610c4392a" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.522451 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-config-data-custom\") pod \"627d090b-c706-469f-9370-f06c1a9d7e89\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.522683 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/627d090b-c706-469f-9370-f06c1a9d7e89-etc-machine-id\") pod \"627d090b-c706-469f-9370-f06c1a9d7e89\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.522824 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-scripts\") pod \"627d090b-c706-469f-9370-f06c1a9d7e89\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.522959 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627d090b-c706-469f-9370-f06c1a9d7e89-logs\") pod \"627d090b-c706-469f-9370-f06c1a9d7e89\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.523013 4776 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-combined-ca-bundle\") pod \"627d090b-c706-469f-9370-f06c1a9d7e89\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.523157 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64k8z\" (UniqueName: \"kubernetes.io/projected/627d090b-c706-469f-9370-f06c1a9d7e89-kube-api-access-64k8z\") pod \"627d090b-c706-469f-9370-f06c1a9d7e89\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.523250 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-config-data\") pod \"627d090b-c706-469f-9370-f06c1a9d7e89\" (UID: \"627d090b-c706-469f-9370-f06c1a9d7e89\") " Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.528015 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.528074 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.535230 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-scripts" (OuterVolumeSpecName: "scripts") pod 
"627d090b-c706-469f-9370-f06c1a9d7e89" (UID: "627d090b-c706-469f-9370-f06c1a9d7e89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.535697 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/627d090b-c706-469f-9370-f06c1a9d7e89-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "627d090b-c706-469f-9370-f06c1a9d7e89" (UID: "627d090b-c706-469f-9370-f06c1a9d7e89"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.540471 4776 scope.go:117] "RemoveContainer" containerID="874ce1100ca1e2a3e16f1930772e05aeb4fae0ae22ac06c5d830527c35c83f77" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.547097 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627d090b-c706-469f-9370-f06c1a9d7e89-kube-api-access-64k8z" (OuterVolumeSpecName: "kube-api-access-64k8z") pod "627d090b-c706-469f-9370-f06c1a9d7e89" (UID: "627d090b-c706-469f-9370-f06c1a9d7e89"). InnerVolumeSpecName "kube-api-access-64k8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.556842 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627d090b-c706-469f-9370-f06c1a9d7e89-logs" (OuterVolumeSpecName: "logs") pod "627d090b-c706-469f-9370-f06c1a9d7e89" (UID: "627d090b-c706-469f-9370-f06c1a9d7e89"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.560027 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "627d090b-c706-469f-9370-f06c1a9d7e89" (UID: "627d090b-c706-469f-9370-f06c1a9d7e89"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.608235 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-config-data" (OuterVolumeSpecName: "config-data") pod "627d090b-c706-469f-9370-f06c1a9d7e89" (UID: "627d090b-c706-469f-9370-f06c1a9d7e89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.618003 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "627d090b-c706-469f-9370-f06c1a9d7e89" (UID: "627d090b-c706-469f-9370-f06c1a9d7e89"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.625777 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.625886 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64k8z\" (UniqueName: \"kubernetes.io/projected/627d090b-c706-469f-9370-f06c1a9d7e89-kube-api-access-64k8z\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.625941 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.626006 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.626046 4776 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/627d090b-c706-469f-9370-f06c1a9d7e89-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.626105 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/627d090b-c706-469f-9370-f06c1a9d7e89-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.626127 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627d090b-c706-469f-9370-f06c1a9d7e89-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.667773 4776 scope.go:117] 
"RemoveContainer" containerID="3cc1b18e216b741b511737efe8c0aa37f10f020bd37732e2d3d41c6610c4392a" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.672767 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:01:19 crc kubenswrapper[4776]: E1204 10:01:19.675701 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cc1b18e216b741b511737efe8c0aa37f10f020bd37732e2d3d41c6610c4392a\": container with ID starting with 3cc1b18e216b741b511737efe8c0aa37f10f020bd37732e2d3d41c6610c4392a not found: ID does not exist" containerID="3cc1b18e216b741b511737efe8c0aa37f10f020bd37732e2d3d41c6610c4392a" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.675753 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cc1b18e216b741b511737efe8c0aa37f10f020bd37732e2d3d41c6610c4392a"} err="failed to get container status \"3cc1b18e216b741b511737efe8c0aa37f10f020bd37732e2d3d41c6610c4392a\": rpc error: code = NotFound desc = could not find container \"3cc1b18e216b741b511737efe8c0aa37f10f020bd37732e2d3d41c6610c4392a\": container with ID starting with 3cc1b18e216b741b511737efe8c0aa37f10f020bd37732e2d3d41c6610c4392a not found: ID does not exist" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.675782 4776 scope.go:117] "RemoveContainer" containerID="874ce1100ca1e2a3e16f1930772e05aeb4fae0ae22ac06c5d830527c35c83f77" Dec 04 10:01:19 crc kubenswrapper[4776]: E1204 10:01:19.677009 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"874ce1100ca1e2a3e16f1930772e05aeb4fae0ae22ac06c5d830527c35c83f77\": container with ID starting with 874ce1100ca1e2a3e16f1930772e05aeb4fae0ae22ac06c5d830527c35c83f77 not found: ID does not exist" containerID="874ce1100ca1e2a3e16f1930772e05aeb4fae0ae22ac06c5d830527c35c83f77" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 
10:01:19.677037 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"874ce1100ca1e2a3e16f1930772e05aeb4fae0ae22ac06c5d830527c35c83f77"} err="failed to get container status \"874ce1100ca1e2a3e16f1930772e05aeb4fae0ae22ac06c5d830527c35c83f77\": rpc error: code = NotFound desc = could not find container \"874ce1100ca1e2a3e16f1930772e05aeb4fae0ae22ac06c5d830527c35c83f77\": container with ID starting with 874ce1100ca1e2a3e16f1930772e05aeb4fae0ae22ac06c5d830527c35c83f77 not found: ID does not exist" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.696833 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.729066 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:01:19 crc kubenswrapper[4776]: E1204 10:01:19.729998 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627d090b-c706-469f-9370-f06c1a9d7e89" containerName="cinder-api-log" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.730018 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="627d090b-c706-469f-9370-f06c1a9d7e89" containerName="cinder-api-log" Dec 04 10:01:19 crc kubenswrapper[4776]: E1204 10:01:19.730082 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627d090b-c706-469f-9370-f06c1a9d7e89" containerName="cinder-api" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.730107 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="627d090b-c706-469f-9370-f06c1a9d7e89" containerName="cinder-api" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.730731 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="627d090b-c706-469f-9370-f06c1a9d7e89" containerName="cinder-api-log" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.730763 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="627d090b-c706-469f-9370-f06c1a9d7e89" 
containerName="cinder-api" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.735350 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.753558 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.760715 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.761031 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.761127 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-scripts\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.761199 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-config-data-custom\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.761304 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.761618 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcpht\" (UniqueName: \"kubernetes.io/projected/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-kube-api-access-hcpht\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.761713 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-logs\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.761775 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-config-data\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.762035 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.765214 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 
10:01:19.765462 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.805051 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.863423 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-config-data\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.863494 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.863548 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.863595 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.863629 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-scripts\") pod \"cinder-api-0\" (UID: 
\"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.863649 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-config-data-custom\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.863697 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.863770 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcpht\" (UniqueName: \"kubernetes.io/projected/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-kube-api-access-hcpht\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.863799 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-logs\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.864246 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-logs\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.864420 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.875495 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-scripts\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.878563 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.879960 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.880462 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-config-data\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.880936 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-config-data-custom\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.882024 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:19 crc kubenswrapper[4776]: I1204 10:01:19.882466 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcpht\" (UniqueName: \"kubernetes.io/projected/4b7e0c9e-6f33-42f0-af0a-0ec740ba7206-kube-api-access-hcpht\") pod \"cinder-api-0\" (UID: \"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206\") " pod="openstack/cinder-api-0" Dec 04 10:01:20 crc kubenswrapper[4776]: I1204 10:01:20.253489 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 10:01:20 crc kubenswrapper[4776]: I1204 10:01:20.378060 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73bc5f13-b110-4ad4-a830-ed9d5e6d9800","Type":"ContainerStarted","Data":"acaa35e0a12165eb84c65c19e1aec808539970384aa439b187c3b16ea047a1d9"} Dec 04 10:01:20 crc kubenswrapper[4776]: I1204 10:01:20.535868 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:01:20 crc kubenswrapper[4776]: I1204 10:01:20.636197 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 10:01:21 crc kubenswrapper[4776]: I1204 10:01:21.394485 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206","Type":"ContainerStarted","Data":"e363773fdb421e7f303eec4da776fa903ce0533624fe8c84916aa7704a08df63"} Dec 04 10:01:21 crc kubenswrapper[4776]: I1204 10:01:21.394838 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206","Type":"ContainerStarted","Data":"0b178467f1697445c372df4714ca5223551681ec14a487224f928908d11a02de"} Dec 04 10:01:21 crc kubenswrapper[4776]: I1204 10:01:21.396819 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73bc5f13-b110-4ad4-a830-ed9d5e6d9800","Type":"ContainerStarted","Data":"3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab"} Dec 04 10:01:21 crc kubenswrapper[4776]: I1204 10:01:21.617843 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="627d090b-c706-469f-9370-f06c1a9d7e89" path="/var/lib/kubelet/pods/627d090b-c706-469f-9370-f06c1a9d7e89/volumes" Dec 04 10:01:22 crc kubenswrapper[4776]: I1204 10:01:22.461384 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73bc5f13-b110-4ad4-a830-ed9d5e6d9800","Type":"ContainerStarted","Data":"f029957a3250bc13788523b36e610d85405d655b0da011b920f8a95ffdcef9a4"} Dec 04 10:01:23 crc kubenswrapper[4776]: I1204 10:01:23.482176 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4b7e0c9e-6f33-42f0-af0a-0ec740ba7206","Type":"ContainerStarted","Data":"18b7cbd55994017b105596ae2c2973bbbee3dc400dede4f296240235d604d56c"} Dec 04 10:01:23 crc kubenswrapper[4776]: I1204 10:01:23.482532 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 04 10:01:23 crc kubenswrapper[4776]: I1204 10:01:23.516632 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.515175481 podStartE2EDuration="4.515175481s" podCreationTimestamp="2025-12-04 10:01:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:01:23.502598666 +0000 UTC m=+1328.369079033" watchObservedRunningTime="2025-12-04 10:01:23.515175481 +0000 UTC 
m=+1328.381655858" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.495559 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73bc5f13-b110-4ad4-a830-ed9d5e6d9800","Type":"ContainerStarted","Data":"502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6"} Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.496991 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.495686 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="ceilometer-central-agent" containerID="cri-o://acaa35e0a12165eb84c65c19e1aec808539970384aa439b187c3b16ea047a1d9" gracePeriod=30 Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.495964 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="sg-core" containerID="cri-o://f029957a3250bc13788523b36e610d85405d655b0da011b920f8a95ffdcef9a4" gracePeriod=30 Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.495990 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="ceilometer-notification-agent" containerID="cri-o://3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab" gracePeriod=30 Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.495897 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="proxy-httpd" containerID="cri-o://502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6" gracePeriod=30 Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.529004 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-db-create-k7pq8"] Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.534850 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-k7pq8" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.583599 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-k7pq8"] Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.588538 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.174600227 podStartE2EDuration="7.588505531s" podCreationTimestamp="2025-12-04 10:01:17 +0000 UTC" firstStartedPulling="2025-12-04 10:01:19.120341399 +0000 UTC m=+1323.986821776" lastFinishedPulling="2025-12-04 10:01:23.534246713 +0000 UTC m=+1328.400727080" observedRunningTime="2025-12-04 10:01:24.559606721 +0000 UTC m=+1329.426087108" watchObservedRunningTime="2025-12-04 10:01:24.588505531 +0000 UTC m=+1329.454985918" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.647765 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-n6cjq"] Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.649244 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-n6cjq" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.664950 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-n6cjq"] Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.666003 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/645e3327-5337-40f6-b730-817d497cf5b8-operator-scripts\") pod \"nova-api-db-create-k7pq8\" (UID: \"645e3327-5337-40f6-b730-817d497cf5b8\") " pod="openstack/nova-api-db-create-k7pq8" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.666062 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2jdw\" (UniqueName: \"kubernetes.io/projected/645e3327-5337-40f6-b730-817d497cf5b8-kube-api-access-w2jdw\") pod \"nova-api-db-create-k7pq8\" (UID: \"645e3327-5337-40f6-b730-817d497cf5b8\") " pod="openstack/nova-api-db-create-k7pq8" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.685640 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-cda9-account-create-update-mhwlb"] Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.687162 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cda9-account-create-update-mhwlb" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.693198 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.702531 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cda9-account-create-update-mhwlb"] Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.767308 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/645e3327-5337-40f6-b730-817d497cf5b8-operator-scripts\") pod \"nova-api-db-create-k7pq8\" (UID: \"645e3327-5337-40f6-b730-817d497cf5b8\") " pod="openstack/nova-api-db-create-k7pq8" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.767368 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1556365b-47ac-4dc6-995c-60236a99c4cc-operator-scripts\") pod \"nova-cell0-db-create-n6cjq\" (UID: \"1556365b-47ac-4dc6-995c-60236a99c4cc\") " pod="openstack/nova-cell0-db-create-n6cjq" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.767408 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2jdw\" (UniqueName: \"kubernetes.io/projected/645e3327-5337-40f6-b730-817d497cf5b8-kube-api-access-w2jdw\") pod \"nova-api-db-create-k7pq8\" (UID: \"645e3327-5337-40f6-b730-817d497cf5b8\") " pod="openstack/nova-api-db-create-k7pq8" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.767435 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2nq\" (UniqueName: \"kubernetes.io/projected/1556365b-47ac-4dc6-995c-60236a99c4cc-kube-api-access-tl2nq\") pod \"nova-cell0-db-create-n6cjq\" (UID: \"1556365b-47ac-4dc6-995c-60236a99c4cc\") " 
pod="openstack/nova-cell0-db-create-n6cjq" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.768294 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/645e3327-5337-40f6-b730-817d497cf5b8-operator-scripts\") pod \"nova-api-db-create-k7pq8\" (UID: \"645e3327-5337-40f6-b730-817d497cf5b8\") " pod="openstack/nova-api-db-create-k7pq8" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.807778 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-858g8"] Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.809114 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-858g8" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.820622 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2jdw\" (UniqueName: \"kubernetes.io/projected/645e3327-5337-40f6-b730-817d497cf5b8-kube-api-access-w2jdw\") pod \"nova-api-db-create-k7pq8\" (UID: \"645e3327-5337-40f6-b730-817d497cf5b8\") " pod="openstack/nova-api-db-create-k7pq8" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.826933 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7211-account-create-update-v84w9"] Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.828155 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7211-account-create-update-v84w9" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.830420 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.837718 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-858g8"] Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.848781 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7211-account-create-update-v84w9"] Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.875499 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-k7pq8" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.876527 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57e217c-6f7f-4ccc-9083-db0620a54c8d-operator-scripts\") pod \"nova-api-cda9-account-create-update-mhwlb\" (UID: \"d57e217c-6f7f-4ccc-9083-db0620a54c8d\") " pod="openstack/nova-api-cda9-account-create-update-mhwlb" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.876630 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1556365b-47ac-4dc6-995c-60236a99c4cc-operator-scripts\") pod \"nova-cell0-db-create-n6cjq\" (UID: \"1556365b-47ac-4dc6-995c-60236a99c4cc\") " pod="openstack/nova-cell0-db-create-n6cjq" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.876685 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdzzj\" (UniqueName: \"kubernetes.io/projected/d57e217c-6f7f-4ccc-9083-db0620a54c8d-kube-api-access-jdzzj\") pod \"nova-api-cda9-account-create-update-mhwlb\" (UID: \"d57e217c-6f7f-4ccc-9083-db0620a54c8d\") " 
pod="openstack/nova-api-cda9-account-create-update-mhwlb" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.876784 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2nq\" (UniqueName: \"kubernetes.io/projected/1556365b-47ac-4dc6-995c-60236a99c4cc-kube-api-access-tl2nq\") pod \"nova-cell0-db-create-n6cjq\" (UID: \"1556365b-47ac-4dc6-995c-60236a99c4cc\") " pod="openstack/nova-cell0-db-create-n6cjq" Dec 04 10:01:24 crc kubenswrapper[4776]: I1204 10:01:24.878355 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1556365b-47ac-4dc6-995c-60236a99c4cc-operator-scripts\") pod \"nova-cell0-db-create-n6cjq\" (UID: \"1556365b-47ac-4dc6-995c-60236a99c4cc\") " pod="openstack/nova-cell0-db-create-n6cjq" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.063746 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/747fc25a-59ce-428f-8459-6180355f4629-operator-scripts\") pod \"nova-cell1-db-create-858g8\" (UID: \"747fc25a-59ce-428f-8459-6180355f4629\") " pod="openstack/nova-cell1-db-create-858g8" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.063832 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732e1a2d-02d2-4754-9489-6bd42ba248e8-operator-scripts\") pod \"nova-cell0-7211-account-create-update-v84w9\" (UID: \"732e1a2d-02d2-4754-9489-6bd42ba248e8\") " pod="openstack/nova-cell0-7211-account-create-update-v84w9" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.063898 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbj4b\" (UniqueName: \"kubernetes.io/projected/732e1a2d-02d2-4754-9489-6bd42ba248e8-kube-api-access-tbj4b\") pod 
\"nova-cell0-7211-account-create-update-v84w9\" (UID: \"732e1a2d-02d2-4754-9489-6bd42ba248e8\") " pod="openstack/nova-cell0-7211-account-create-update-v84w9" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.063966 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57e217c-6f7f-4ccc-9083-db0620a54c8d-operator-scripts\") pod \"nova-api-cda9-account-create-update-mhwlb\" (UID: \"d57e217c-6f7f-4ccc-9083-db0620a54c8d\") " pod="openstack/nova-api-cda9-account-create-update-mhwlb" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.063993 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lfhc\" (UniqueName: \"kubernetes.io/projected/747fc25a-59ce-428f-8459-6180355f4629-kube-api-access-7lfhc\") pod \"nova-cell1-db-create-858g8\" (UID: \"747fc25a-59ce-428f-8459-6180355f4629\") " pod="openstack/nova-cell1-db-create-858g8" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.064038 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdzzj\" (UniqueName: \"kubernetes.io/projected/d57e217c-6f7f-4ccc-9083-db0620a54c8d-kube-api-access-jdzzj\") pod \"nova-api-cda9-account-create-update-mhwlb\" (UID: \"d57e217c-6f7f-4ccc-9083-db0620a54c8d\") " pod="openstack/nova-api-cda9-account-create-update-mhwlb" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.067009 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57e217c-6f7f-4ccc-9083-db0620a54c8d-operator-scripts\") pod \"nova-api-cda9-account-create-update-mhwlb\" (UID: \"d57e217c-6f7f-4ccc-9083-db0620a54c8d\") " pod="openstack/nova-api-cda9-account-create-update-mhwlb" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.096296 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2nq\" 
(UniqueName: \"kubernetes.io/projected/1556365b-47ac-4dc6-995c-60236a99c4cc-kube-api-access-tl2nq\") pod \"nova-cell0-db-create-n6cjq\" (UID: \"1556365b-47ac-4dc6-995c-60236a99c4cc\") " pod="openstack/nova-cell0-db-create-n6cjq" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.096811 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdzzj\" (UniqueName: \"kubernetes.io/projected/d57e217c-6f7f-4ccc-9083-db0620a54c8d-kube-api-access-jdzzj\") pod \"nova-api-cda9-account-create-update-mhwlb\" (UID: \"d57e217c-6f7f-4ccc-9083-db0620a54c8d\") " pod="openstack/nova-api-cda9-account-create-update-mhwlb" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.138985 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7d33-account-create-update-j4l8p"] Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.140176 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.143550 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.150393 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7d33-account-create-update-j4l8p"] Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.165577 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lfhc\" (UniqueName: \"kubernetes.io/projected/747fc25a-59ce-428f-8459-6180355f4629-kube-api-access-7lfhc\") pod \"nova-cell1-db-create-858g8\" (UID: \"747fc25a-59ce-428f-8459-6180355f4629\") " pod="openstack/nova-cell1-db-create-858g8" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.165933 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/747fc25a-59ce-428f-8459-6180355f4629-operator-scripts\") pod \"nova-cell1-db-create-858g8\" (UID: \"747fc25a-59ce-428f-8459-6180355f4629\") " pod="openstack/nova-cell1-db-create-858g8" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.166058 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732e1a2d-02d2-4754-9489-6bd42ba248e8-operator-scripts\") pod \"nova-cell0-7211-account-create-update-v84w9\" (UID: \"732e1a2d-02d2-4754-9489-6bd42ba248e8\") " pod="openstack/nova-cell0-7211-account-create-update-v84w9" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.166677 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbj4b\" (UniqueName: \"kubernetes.io/projected/732e1a2d-02d2-4754-9489-6bd42ba248e8-kube-api-access-tbj4b\") pod \"nova-cell0-7211-account-create-update-v84w9\" (UID: \"732e1a2d-02d2-4754-9489-6bd42ba248e8\") " pod="openstack/nova-cell0-7211-account-create-update-v84w9" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.167230 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/747fc25a-59ce-428f-8459-6180355f4629-operator-scripts\") pod \"nova-cell1-db-create-858g8\" (UID: \"747fc25a-59ce-428f-8459-6180355f4629\") " pod="openstack/nova-cell1-db-create-858g8" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.168008 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732e1a2d-02d2-4754-9489-6bd42ba248e8-operator-scripts\") pod \"nova-cell0-7211-account-create-update-v84w9\" (UID: \"732e1a2d-02d2-4754-9489-6bd42ba248e8\") " pod="openstack/nova-cell0-7211-account-create-update-v84w9" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.189329 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tbj4b\" (UniqueName: \"kubernetes.io/projected/732e1a2d-02d2-4754-9489-6bd42ba248e8-kube-api-access-tbj4b\") pod \"nova-cell0-7211-account-create-update-v84w9\" (UID: \"732e1a2d-02d2-4754-9489-6bd42ba248e8\") " pod="openstack/nova-cell0-7211-account-create-update-v84w9" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.190439 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lfhc\" (UniqueName: \"kubernetes.io/projected/747fc25a-59ce-428f-8459-6180355f4629-kube-api-access-7lfhc\") pod \"nova-cell1-db-create-858g8\" (UID: \"747fc25a-59ce-428f-8459-6180355f4629\") " pod="openstack/nova-cell1-db-create-858g8" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.236319 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7211-account-create-update-v84w9" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.269301 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41998dcd-34d5-4335-a5fa-8e6ceb8aac4c-operator-scripts\") pod \"nova-cell1-7d33-account-create-update-j4l8p\" (UID: \"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c\") " pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.269428 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpctf\" (UniqueName: \"kubernetes.io/projected/41998dcd-34d5-4335-a5fa-8e6ceb8aac4c-kube-api-access-bpctf\") pod \"nova-cell1-7d33-account-create-update-j4l8p\" (UID: \"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c\") " pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.282223 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-n6cjq" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.420769 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-858g8" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.420959 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cda9-account-create-update-mhwlb" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.424463 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41998dcd-34d5-4335-a5fa-8e6ceb8aac4c-operator-scripts\") pod \"nova-cell1-7d33-account-create-update-j4l8p\" (UID: \"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c\") " pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.424649 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpctf\" (UniqueName: \"kubernetes.io/projected/41998dcd-34d5-4335-a5fa-8e6ceb8aac4c-kube-api-access-bpctf\") pod \"nova-cell1-7d33-account-create-update-j4l8p\" (UID: \"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c\") " pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.426340 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41998dcd-34d5-4335-a5fa-8e6ceb8aac4c-operator-scripts\") pod \"nova-cell1-7d33-account-create-update-j4l8p\" (UID: \"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c\") " pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.445841 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpctf\" (UniqueName: \"kubernetes.io/projected/41998dcd-34d5-4335-a5fa-8e6ceb8aac4c-kube-api-access-bpctf\") pod 
\"nova-cell1-7d33-account-create-update-j4l8p\" (UID: \"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c\") " pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.541988 4776 generic.go:334] "Generic (PLEG): container finished" podID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerID="502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6" exitCode=0 Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.542028 4776 generic.go:334] "Generic (PLEG): container finished" podID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerID="f029957a3250bc13788523b36e610d85405d655b0da011b920f8a95ffdcef9a4" exitCode=2 Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.542041 4776 generic.go:334] "Generic (PLEG): container finished" podID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerID="3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab" exitCode=0 Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.542067 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73bc5f13-b110-4ad4-a830-ed9d5e6d9800","Type":"ContainerDied","Data":"502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6"} Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.542100 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73bc5f13-b110-4ad4-a830-ed9d5e6d9800","Type":"ContainerDied","Data":"f029957a3250bc13788523b36e610d85405d655b0da011b920f8a95ffdcef9a4"} Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.542114 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73bc5f13-b110-4ad4-a830-ed9d5e6d9800","Type":"ContainerDied","Data":"3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab"} Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.560727 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" Dec 04 10:01:25 crc kubenswrapper[4776]: E1204 10:01:25.710036 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73bc5f13_b110_4ad4_a830_ed9d5e6d9800.slice/crio-502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73bc5f13_b110_4ad4_a830_ed9d5e6d9800.slice/crio-conmon-502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73bc5f13_b110_4ad4_a830_ed9d5e6d9800.slice/crio-conmon-3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:01:25 crc kubenswrapper[4776]: I1204 10:01:25.749253 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-k7pq8"] Dec 04 10:01:26 crc kubenswrapper[4776]: I1204 10:01:26.121360 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7211-account-create-update-v84w9"] Dec 04 10:01:26 crc kubenswrapper[4776]: I1204 10:01:26.391676 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-n6cjq"] Dec 04 10:01:26 crc kubenswrapper[4776]: I1204 10:01:26.560510 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-k7pq8" event={"ID":"645e3327-5337-40f6-b730-817d497cf5b8","Type":"ContainerStarted","Data":"3d985c2f5e44fcc35110063a7f39dfaf8028f5883eeb56b8b63aca84828480a4"} Dec 04 10:01:26 crc kubenswrapper[4776]: I1204 10:01:26.560556 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-k7pq8" 
event={"ID":"645e3327-5337-40f6-b730-817d497cf5b8","Type":"ContainerStarted","Data":"d2f3f0b76b5c35803640be762599e107c0e0221257e2a6e9c19d4340ff847e96"} Dec 04 10:01:26 crc kubenswrapper[4776]: I1204 10:01:26.576107 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7211-account-create-update-v84w9" event={"ID":"732e1a2d-02d2-4754-9489-6bd42ba248e8","Type":"ContainerStarted","Data":"9c8ffff73df9a5eb2b923bf7652fbd57633ac3f23fd574b99637dbd86d54ae86"} Dec 04 10:01:26 crc kubenswrapper[4776]: I1204 10:01:26.576145 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7211-account-create-update-v84w9" event={"ID":"732e1a2d-02d2-4754-9489-6bd42ba248e8","Type":"ContainerStarted","Data":"7329ded299d2c7bfc4257bc0857539c2a874a5a81ea669ea3969d3f3086fc413"} Dec 04 10:01:26 crc kubenswrapper[4776]: I1204 10:01:26.581589 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-n6cjq" event={"ID":"1556365b-47ac-4dc6-995c-60236a99c4cc","Type":"ContainerStarted","Data":"eda7e4601f164d961b0ae9089dbd8ed3e0fe3465b08627304755d066c62cd60e"} Dec 04 10:01:26 crc kubenswrapper[4776]: I1204 10:01:26.601585 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-k7pq8" podStartSLOduration=2.601563024 podStartE2EDuration="2.601563024s" podCreationTimestamp="2025-12-04 10:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:01:26.579113337 +0000 UTC m=+1331.445593714" watchObservedRunningTime="2025-12-04 10:01:26.601563024 +0000 UTC m=+1331.468043401" Dec 04 10:01:26 crc kubenswrapper[4776]: I1204 10:01:26.639036 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-7211-account-create-update-v84w9" podStartSLOduration=2.6388909590000003 podStartE2EDuration="2.638890959s" podCreationTimestamp="2025-12-04 
10:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:01:26.59636701 +0000 UTC m=+1331.462847397" watchObservedRunningTime="2025-12-04 10:01:26.638890959 +0000 UTC m=+1331.505371336" Dec 04 10:01:26 crc kubenswrapper[4776]: I1204 10:01:26.847727 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-858g8"] Dec 04 10:01:26 crc kubenswrapper[4776]: I1204 10:01:26.911448 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7d33-account-create-update-j4l8p"] Dec 04 10:01:26 crc kubenswrapper[4776]: I1204 10:01:26.920415 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cda9-account-create-update-mhwlb"] Dec 04 10:01:26 crc kubenswrapper[4776]: W1204 10:01:26.920899 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd57e217c_6f7f_4ccc_9083_db0620a54c8d.slice/crio-3029d93b0750925b382b5cbf5c7aa16bba5f4b14165c511d569d34c445bb008d WatchSource:0}: Error finding container 3029d93b0750925b382b5cbf5c7aa16bba5f4b14165c511d569d34c445bb008d: Status 404 returned error can't find the container with id 3029d93b0750925b382b5cbf5c7aa16bba5f4b14165c511d569d34c445bb008d Dec 04 10:01:26 crc kubenswrapper[4776]: W1204 10:01:26.931127 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41998dcd_34d5_4335_a5fa_8e6ceb8aac4c.slice/crio-af91bca4d7cc213b051eb95e8ca3d46d6978ccce283787f7f6f5301d80ade45e WatchSource:0}: Error finding container af91bca4d7cc213b051eb95e8ca3d46d6978ccce283787f7f6f5301d80ade45e: Status 404 returned error can't find the container with id af91bca4d7cc213b051eb95e8ca3d46d6978ccce283787f7f6f5301d80ade45e Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.592079 4776 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-api-cda9-account-create-update-mhwlb" event={"ID":"d57e217c-6f7f-4ccc-9083-db0620a54c8d","Type":"ContainerStarted","Data":"9d06582bd9a4bc299fa642414be26627c5b4cff36919a19976ea357175b4e8f6"} Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.592422 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cda9-account-create-update-mhwlb" event={"ID":"d57e217c-6f7f-4ccc-9083-db0620a54c8d","Type":"ContainerStarted","Data":"3029d93b0750925b382b5cbf5c7aa16bba5f4b14165c511d569d34c445bb008d"} Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.595283 4776 generic.go:334] "Generic (PLEG): container finished" podID="645e3327-5337-40f6-b730-817d497cf5b8" containerID="3d985c2f5e44fcc35110063a7f39dfaf8028f5883eeb56b8b63aca84828480a4" exitCode=0 Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.595336 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-k7pq8" event={"ID":"645e3327-5337-40f6-b730-817d497cf5b8","Type":"ContainerDied","Data":"3d985c2f5e44fcc35110063a7f39dfaf8028f5883eeb56b8b63aca84828480a4"} Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.597638 4776 generic.go:334] "Generic (PLEG): container finished" podID="732e1a2d-02d2-4754-9489-6bd42ba248e8" containerID="9c8ffff73df9a5eb2b923bf7652fbd57633ac3f23fd574b99637dbd86d54ae86" exitCode=0 Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.597691 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7211-account-create-update-v84w9" event={"ID":"732e1a2d-02d2-4754-9489-6bd42ba248e8","Type":"ContainerDied","Data":"9c8ffff73df9a5eb2b923bf7652fbd57633ac3f23fd574b99637dbd86d54ae86"} Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.605997 4776 generic.go:334] "Generic (PLEG): container finished" podID="1556365b-47ac-4dc6-995c-60236a99c4cc" containerID="218ecda0a844e63a1bc441cc11be3bac531cabb28bd73bce33ce24ac587a079b" exitCode=0 Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 
10:01:27.606206 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-n6cjq" event={"ID":"1556365b-47ac-4dc6-995c-60236a99c4cc","Type":"ContainerDied","Data":"218ecda0a844e63a1bc441cc11be3bac531cabb28bd73bce33ce24ac587a079b"} Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.608655 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-858g8" event={"ID":"747fc25a-59ce-428f-8459-6180355f4629","Type":"ContainerStarted","Data":"e9c408e16347b3639fbae337de308b7d6acfb1b7325d74cd39225fe260d22809"} Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.608711 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-858g8" event={"ID":"747fc25a-59ce-428f-8459-6180355f4629","Type":"ContainerStarted","Data":"36c8098d2b4105b5982d38fa1e04b0e04fd2121cf98fd1efd317f7e7e3b3f4a0"} Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.610564 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" event={"ID":"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c","Type":"ContainerStarted","Data":"03b7e9728273558121b71c4da2ebfe7ed7cfd1eb9b9deaf34148432e1c81bb73"} Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.610614 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" event={"ID":"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c","Type":"ContainerStarted","Data":"af91bca4d7cc213b051eb95e8ca3d46d6978ccce283787f7f6f5301d80ade45e"} Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.631001 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-cda9-account-create-update-mhwlb" podStartSLOduration=3.63097768 podStartE2EDuration="3.63097768s" podCreationTimestamp="2025-12-04 10:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 
10:01:27.618149277 +0000 UTC m=+1332.484629654" watchObservedRunningTime="2025-12-04 10:01:27.63097768 +0000 UTC m=+1332.497458057" Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.663601 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" podStartSLOduration=2.663576627 podStartE2EDuration="2.663576627s" podCreationTimestamp="2025-12-04 10:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:01:27.652632723 +0000 UTC m=+1332.519113120" watchObservedRunningTime="2025-12-04 10:01:27.663576627 +0000 UTC m=+1332.530057004" Dec 04 10:01:27 crc kubenswrapper[4776]: I1204 10:01:27.698722 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-858g8" podStartSLOduration=3.698704783 podStartE2EDuration="3.698704783s" podCreationTimestamp="2025-12-04 10:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:01:27.675392278 +0000 UTC m=+1332.541872655" watchObservedRunningTime="2025-12-04 10:01:27.698704783 +0000 UTC m=+1332.565185160" Dec 04 10:01:28 crc kubenswrapper[4776]: I1204 10:01:28.622052 4776 generic.go:334] "Generic (PLEG): container finished" podID="747fc25a-59ce-428f-8459-6180355f4629" containerID="e9c408e16347b3639fbae337de308b7d6acfb1b7325d74cd39225fe260d22809" exitCode=0 Dec 04 10:01:28 crc kubenswrapper[4776]: I1204 10:01:28.622145 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-858g8" event={"ID":"747fc25a-59ce-428f-8459-6180355f4629","Type":"ContainerDied","Data":"e9c408e16347b3639fbae337de308b7d6acfb1b7325d74cd39225fe260d22809"} Dec 04 10:01:28 crc kubenswrapper[4776]: I1204 10:01:28.624122 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="41998dcd-34d5-4335-a5fa-8e6ceb8aac4c" containerID="03b7e9728273558121b71c4da2ebfe7ed7cfd1eb9b9deaf34148432e1c81bb73" exitCode=0 Dec 04 10:01:28 crc kubenswrapper[4776]: I1204 10:01:28.624215 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" event={"ID":"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c","Type":"ContainerDied","Data":"03b7e9728273558121b71c4da2ebfe7ed7cfd1eb9b9deaf34148432e1c81bb73"} Dec 04 10:01:28 crc kubenswrapper[4776]: I1204 10:01:28.625844 4776 generic.go:334] "Generic (PLEG): container finished" podID="d57e217c-6f7f-4ccc-9083-db0620a54c8d" containerID="9d06582bd9a4bc299fa642414be26627c5b4cff36919a19976ea357175b4e8f6" exitCode=0 Dec 04 10:01:28 crc kubenswrapper[4776]: I1204 10:01:28.625882 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cda9-account-create-update-mhwlb" event={"ID":"d57e217c-6f7f-4ccc-9083-db0620a54c8d","Type":"ContainerDied","Data":"9d06582bd9a4bc299fa642414be26627c5b4cff36919a19976ea357175b4e8f6"} Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.312146 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-k7pq8" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.338358 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7211-account-create-update-v84w9" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.353396 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-n6cjq" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.430732 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/645e3327-5337-40f6-b730-817d497cf5b8-operator-scripts\") pod \"645e3327-5337-40f6-b730-817d497cf5b8\" (UID: \"645e3327-5337-40f6-b730-817d497cf5b8\") " Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.431217 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2jdw\" (UniqueName: \"kubernetes.io/projected/645e3327-5337-40f6-b730-817d497cf5b8-kube-api-access-w2jdw\") pod \"645e3327-5337-40f6-b730-817d497cf5b8\" (UID: \"645e3327-5337-40f6-b730-817d497cf5b8\") " Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.431276 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732e1a2d-02d2-4754-9489-6bd42ba248e8-operator-scripts\") pod \"732e1a2d-02d2-4754-9489-6bd42ba248e8\" (UID: \"732e1a2d-02d2-4754-9489-6bd42ba248e8\") " Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.431325 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1556365b-47ac-4dc6-995c-60236a99c4cc-operator-scripts\") pod \"1556365b-47ac-4dc6-995c-60236a99c4cc\" (UID: \"1556365b-47ac-4dc6-995c-60236a99c4cc\") " Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.431435 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl2nq\" (UniqueName: \"kubernetes.io/projected/1556365b-47ac-4dc6-995c-60236a99c4cc-kube-api-access-tl2nq\") pod \"1556365b-47ac-4dc6-995c-60236a99c4cc\" (UID: \"1556365b-47ac-4dc6-995c-60236a99c4cc\") " Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.431470 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-tbj4b\" (UniqueName: \"kubernetes.io/projected/732e1a2d-02d2-4754-9489-6bd42ba248e8-kube-api-access-tbj4b\") pod \"732e1a2d-02d2-4754-9489-6bd42ba248e8\" (UID: \"732e1a2d-02d2-4754-9489-6bd42ba248e8\") " Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.432195 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/732e1a2d-02d2-4754-9489-6bd42ba248e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "732e1a2d-02d2-4754-9489-6bd42ba248e8" (UID: "732e1a2d-02d2-4754-9489-6bd42ba248e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.432306 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1556365b-47ac-4dc6-995c-60236a99c4cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1556365b-47ac-4dc6-995c-60236a99c4cc" (UID: "1556365b-47ac-4dc6-995c-60236a99c4cc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.432502 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/645e3327-5337-40f6-b730-817d497cf5b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "645e3327-5337-40f6-b730-817d497cf5b8" (UID: "645e3327-5337-40f6-b730-817d497cf5b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.437528 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1556365b-47ac-4dc6-995c-60236a99c4cc-kube-api-access-tl2nq" (OuterVolumeSpecName: "kube-api-access-tl2nq") pod "1556365b-47ac-4dc6-995c-60236a99c4cc" (UID: "1556365b-47ac-4dc6-995c-60236a99c4cc"). 
InnerVolumeSpecName "kube-api-access-tl2nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.438634 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/645e3327-5337-40f6-b730-817d497cf5b8-kube-api-access-w2jdw" (OuterVolumeSpecName: "kube-api-access-w2jdw") pod "645e3327-5337-40f6-b730-817d497cf5b8" (UID: "645e3327-5337-40f6-b730-817d497cf5b8"). InnerVolumeSpecName "kube-api-access-w2jdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.440718 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/732e1a2d-02d2-4754-9489-6bd42ba248e8-kube-api-access-tbj4b" (OuterVolumeSpecName: "kube-api-access-tbj4b") pod "732e1a2d-02d2-4754-9489-6bd42ba248e8" (UID: "732e1a2d-02d2-4754-9489-6bd42ba248e8"). InnerVolumeSpecName "kube-api-access-tbj4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.489270 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.533165 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-run-httpd\") pod \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.533253 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrpsb\" (UniqueName: \"kubernetes.io/projected/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-kube-api-access-lrpsb\") pod \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.533596 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "73bc5f13-b110-4ad4-a830-ed9d5e6d9800" (UID: "73bc5f13-b110-4ad4-a830-ed9d5e6d9800"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.533892 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-sg-core-conf-yaml\") pod \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.534487 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-scripts\") pod \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.534546 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-log-httpd\") pod \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.534605 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-combined-ca-bundle\") pod \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.534653 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-config-data\") pod \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\" (UID: \"73bc5f13-b110-4ad4-a830-ed9d5e6d9800\") " Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.535544 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/645e3327-5337-40f6-b730-817d497cf5b8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.535566 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.535576 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2jdw\" (UniqueName: \"kubernetes.io/projected/645e3327-5337-40f6-b730-817d497cf5b8-kube-api-access-w2jdw\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.535586 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732e1a2d-02d2-4754-9489-6bd42ba248e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.535596 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1556365b-47ac-4dc6-995c-60236a99c4cc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.535607 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl2nq\" (UniqueName: \"kubernetes.io/projected/1556365b-47ac-4dc6-995c-60236a99c4cc-kube-api-access-tl2nq\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.535615 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbj4b\" (UniqueName: \"kubernetes.io/projected/732e1a2d-02d2-4754-9489-6bd42ba248e8-kube-api-access-tbj4b\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.537872 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-kube-api-access-lrpsb" 
(OuterVolumeSpecName: "kube-api-access-lrpsb") pod "73bc5f13-b110-4ad4-a830-ed9d5e6d9800" (UID: "73bc5f13-b110-4ad4-a830-ed9d5e6d9800"). InnerVolumeSpecName "kube-api-access-lrpsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.538002 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "73bc5f13-b110-4ad4-a830-ed9d5e6d9800" (UID: "73bc5f13-b110-4ad4-a830-ed9d5e6d9800"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.538958 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-scripts" (OuterVolumeSpecName: "scripts") pod "73bc5f13-b110-4ad4-a830-ed9d5e6d9800" (UID: "73bc5f13-b110-4ad4-a830-ed9d5e6d9800"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.562186 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "73bc5f13-b110-4ad4-a830-ed9d5e6d9800" (UID: "73bc5f13-b110-4ad4-a830-ed9d5e6d9800"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.622692 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73bc5f13-b110-4ad4-a830-ed9d5e6d9800" (UID: "73bc5f13-b110-4ad4-a830-ed9d5e6d9800"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.637951 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.638185 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.638195 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.638205 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.638217 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrpsb\" (UniqueName: \"kubernetes.io/projected/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-kube-api-access-lrpsb\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.645198 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7211-account-create-update-v84w9" event={"ID":"732e1a2d-02d2-4754-9489-6bd42ba248e8","Type":"ContainerDied","Data":"7329ded299d2c7bfc4257bc0857539c2a874a5a81ea669ea3969d3f3086fc413"} Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.645231 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7211-account-create-update-v84w9" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.645275 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7329ded299d2c7bfc4257bc0857539c2a874a5a81ea669ea3969d3f3086fc413" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.650097 4776 generic.go:334] "Generic (PLEG): container finished" podID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerID="acaa35e0a12165eb84c65c19e1aec808539970384aa439b187c3b16ea047a1d9" exitCode=0 Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.650178 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73bc5f13-b110-4ad4-a830-ed9d5e6d9800","Type":"ContainerDied","Data":"acaa35e0a12165eb84c65c19e1aec808539970384aa439b187c3b16ea047a1d9"} Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.650208 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73bc5f13-b110-4ad4-a830-ed9d5e6d9800","Type":"ContainerDied","Data":"8e25f8afec68d940f4f7818609036d6d085a27f1dab934a57b53e42749d41e33"} Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.650228 4776 scope.go:117] "RemoveContainer" containerID="502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.650481 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.654363 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5d93526f-f97b-4a2a-98b4-4b880a99cbd7","Type":"ContainerStarted","Data":"c696da323aec948d76e07debf48b4cfc2c2b21611512d155fb4d404f4bdef362"} Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.658439 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-n6cjq" event={"ID":"1556365b-47ac-4dc6-995c-60236a99c4cc","Type":"ContainerDied","Data":"eda7e4601f164d961b0ae9089dbd8ed3e0fe3465b08627304755d066c62cd60e"} Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.658486 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eda7e4601f164d961b0ae9089dbd8ed3e0fe3465b08627304755d066c62cd60e" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.658541 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-n6cjq" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.660823 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-k7pq8" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.669341 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-k7pq8" event={"ID":"645e3327-5337-40f6-b730-817d497cf5b8","Type":"ContainerDied","Data":"d2f3f0b76b5c35803640be762599e107c0e0221257e2a6e9c19d4340ff847e96"} Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.669390 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2f3f0b76b5c35803640be762599e107c0e0221257e2a6e9c19d4340ff847e96" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.671601 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.005438288 podStartE2EDuration="29.67158032s" podCreationTimestamp="2025-12-04 10:01:00 +0000 UTC" firstStartedPulling="2025-12-04 10:01:01.224459742 +0000 UTC m=+1306.090940119" lastFinishedPulling="2025-12-04 10:01:28.890601774 +0000 UTC m=+1333.757082151" observedRunningTime="2025-12-04 10:01:29.669498195 +0000 UTC m=+1334.535978572" watchObservedRunningTime="2025-12-04 10:01:29.67158032 +0000 UTC m=+1334.538060697" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.676130 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-config-data" (OuterVolumeSpecName: "config-data") pod "73bc5f13-b110-4ad4-a830-ed9d5e6d9800" (UID: "73bc5f13-b110-4ad4-a830-ed9d5e6d9800"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.832647 4776 scope.go:117] "RemoveContainer" containerID="f029957a3250bc13788523b36e610d85405d655b0da011b920f8a95ffdcef9a4" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.837156 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73bc5f13-b110-4ad4-a830-ed9d5e6d9800-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.880461 4776 scope.go:117] "RemoveContainer" containerID="3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab" Dec 04 10:01:29 crc kubenswrapper[4776]: I1204 10:01:29.918026 4776 scope.go:117] "RemoveContainer" containerID="acaa35e0a12165eb84c65c19e1aec808539970384aa439b187c3b16ea047a1d9" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.020027 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.038049 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.053814 4776 scope.go:117] "RemoveContainer" containerID="502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6" Dec 04 10:01:30 crc kubenswrapper[4776]: E1204 10:01:30.054366 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6\": container with ID starting with 502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6 not found: ID does not exist" containerID="502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.054469 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6"} err="failed to get container status \"502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6\": rpc error: code = NotFound desc = could not find container \"502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6\": container with ID starting with 502299252f3b646a3e0c4b5279fe59a5735d88e121861df91a8439a98fb6cfe6 not found: ID does not exist" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.054572 4776 scope.go:117] "RemoveContainer" containerID="f029957a3250bc13788523b36e610d85405d655b0da011b920f8a95ffdcef9a4" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.054833 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:01:30 crc kubenswrapper[4776]: E1204 10:01:30.055221 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f029957a3250bc13788523b36e610d85405d655b0da011b920f8a95ffdcef9a4\": container with ID starting with f029957a3250bc13788523b36e610d85405d655b0da011b920f8a95ffdcef9a4 not found: ID does not exist" containerID="f029957a3250bc13788523b36e610d85405d655b0da011b920f8a95ffdcef9a4" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.055266 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f029957a3250bc13788523b36e610d85405d655b0da011b920f8a95ffdcef9a4"} err="failed to get container status \"f029957a3250bc13788523b36e610d85405d655b0da011b920f8a95ffdcef9a4\": rpc error: code = NotFound desc = could not find container \"f029957a3250bc13788523b36e610d85405d655b0da011b920f8a95ffdcef9a4\": container with ID starting with f029957a3250bc13788523b36e610d85405d655b0da011b920f8a95ffdcef9a4 not found: ID does not exist" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.055294 4776 scope.go:117] "RemoveContainer" 
containerID="3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab" Dec 04 10:01:30 crc kubenswrapper[4776]: E1204 10:01:30.055726 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="sg-core" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.055821 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="sg-core" Dec 04 10:01:30 crc kubenswrapper[4776]: E1204 10:01:30.055903 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="proxy-httpd" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.055988 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="proxy-httpd" Dec 04 10:01:30 crc kubenswrapper[4776]: E1204 10:01:30.056065 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1556365b-47ac-4dc6-995c-60236a99c4cc" containerName="mariadb-database-create" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.056137 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1556365b-47ac-4dc6-995c-60236a99c4cc" containerName="mariadb-database-create" Dec 04 10:01:30 crc kubenswrapper[4776]: E1204 10:01:30.056220 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="ceilometer-central-agent" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.056404 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="ceilometer-central-agent" Dec 04 10:01:30 crc kubenswrapper[4776]: E1204 10:01:30.056544 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="ceilometer-notification-agent" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.056638 4776 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="ceilometer-notification-agent" Dec 04 10:01:30 crc kubenswrapper[4776]: E1204 10:01:30.056832 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="732e1a2d-02d2-4754-9489-6bd42ba248e8" containerName="mariadb-account-create-update" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.056908 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="732e1a2d-02d2-4754-9489-6bd42ba248e8" containerName="mariadb-account-create-update" Dec 04 10:01:30 crc kubenswrapper[4776]: E1204 10:01:30.057077 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="645e3327-5337-40f6-b730-817d497cf5b8" containerName="mariadb-database-create" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.057178 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="645e3327-5337-40f6-b730-817d497cf5b8" containerName="mariadb-database-create" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.057525 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="645e3327-5337-40f6-b730-817d497cf5b8" containerName="mariadb-database-create" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.057621 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="proxy-httpd" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.057713 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="732e1a2d-02d2-4754-9489-6bd42ba248e8" containerName="mariadb-account-create-update" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.058357 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1556365b-47ac-4dc6-995c-60236a99c4cc" containerName="mariadb-database-create" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.058452 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="ceilometer-notification-agent" 
Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.058545 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="ceilometer-central-agent" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.058671 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" containerName="sg-core" Dec 04 10:01:30 crc kubenswrapper[4776]: E1204 10:01:30.056309 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab\": container with ID starting with 3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab not found: ID does not exist" containerID="3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.059439 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab"} err="failed to get container status \"3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab\": rpc error: code = NotFound desc = could not find container \"3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab\": container with ID starting with 3ad200896301bba0c67876a2e405ff60148d75b59ba535bee64b63da44daaeab not found: ID does not exist" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.059472 4776 scope.go:117] "RemoveContainer" containerID="acaa35e0a12165eb84c65c19e1aec808539970384aa439b187c3b16ea047a1d9" Dec 04 10:01:30 crc kubenswrapper[4776]: E1204 10:01:30.060387 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acaa35e0a12165eb84c65c19e1aec808539970384aa439b187c3b16ea047a1d9\": container with ID starting with acaa35e0a12165eb84c65c19e1aec808539970384aa439b187c3b16ea047a1d9 
not found: ID does not exist" containerID="acaa35e0a12165eb84c65c19e1aec808539970384aa439b187c3b16ea047a1d9" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.060423 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acaa35e0a12165eb84c65c19e1aec808539970384aa439b187c3b16ea047a1d9"} err="failed to get container status \"acaa35e0a12165eb84c65c19e1aec808539970384aa439b187c3b16ea047a1d9\": rpc error: code = NotFound desc = could not find container \"acaa35e0a12165eb84c65c19e1aec808539970384aa439b187c3b16ea047a1d9\": container with ID starting with acaa35e0a12165eb84c65c19e1aec808539970384aa439b187c3b16ea047a1d9 not found: ID does not exist" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.061161 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.064817 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.065069 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.069531 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.148503 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.148577 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-scripts\") pod 
\"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.148621 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.148658 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxj6n\" (UniqueName: \"kubernetes.io/projected/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-kube-api-access-hxj6n\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.148723 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-config-data\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.148770 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-log-httpd\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.148800 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-run-httpd\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc 
kubenswrapper[4776]: I1204 10:01:30.509541 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.509603 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxj6n\" (UniqueName: \"kubernetes.io/projected/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-kube-api-access-hxj6n\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.509691 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-config-data\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.509711 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-log-httpd\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.509752 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-run-httpd\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.509785 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.509833 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-scripts\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.510362 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-log-httpd\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.512405 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-run-httpd\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.518752 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.519072 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-config-data\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.522100 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-scripts\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.540354 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.550061 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxj6n\" (UniqueName: \"kubernetes.io/projected/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-kube-api-access-hxj6n\") pod \"ceilometer-0\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") " pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.674360 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cda9-account-create-update-mhwlb" event={"ID":"d57e217c-6f7f-4ccc-9083-db0620a54c8d","Type":"ContainerDied","Data":"3029d93b0750925b382b5cbf5c7aa16bba5f4b14165c511d569d34c445bb008d"} Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.674409 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3029d93b0750925b382b5cbf5c7aa16bba5f4b14165c511d569d34c445bb008d" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.680245 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-858g8" event={"ID":"747fc25a-59ce-428f-8459-6180355f4629","Type":"ContainerDied","Data":"36c8098d2b4105b5982d38fa1e04b0e04fd2121cf98fd1efd317f7e7e3b3f4a0"} Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.680283 4776 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="36c8098d2b4105b5982d38fa1e04b0e04fd2121cf98fd1efd317f7e7e3b3f4a0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.682733 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" event={"ID":"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c","Type":"ContainerDied","Data":"af91bca4d7cc213b051eb95e8ca3d46d6978ccce283787f7f6f5301d80ade45e"} Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.683126 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af91bca4d7cc213b051eb95e8ca3d46d6978ccce283787f7f6f5301d80ade45e" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.690992 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.703592 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-858g8" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.714995 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cda9-account-create-update-mhwlb" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.729347 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.830646 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lfhc\" (UniqueName: \"kubernetes.io/projected/747fc25a-59ce-428f-8459-6180355f4629-kube-api-access-7lfhc\") pod \"747fc25a-59ce-428f-8459-6180355f4629\" (UID: \"747fc25a-59ce-428f-8459-6180355f4629\") " Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.830686 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57e217c-6f7f-4ccc-9083-db0620a54c8d-operator-scripts\") pod \"d57e217c-6f7f-4ccc-9083-db0620a54c8d\" (UID: \"d57e217c-6f7f-4ccc-9083-db0620a54c8d\") " Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.830743 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdzzj\" (UniqueName: \"kubernetes.io/projected/d57e217c-6f7f-4ccc-9083-db0620a54c8d-kube-api-access-jdzzj\") pod \"d57e217c-6f7f-4ccc-9083-db0620a54c8d\" (UID: \"d57e217c-6f7f-4ccc-9083-db0620a54c8d\") " Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.830918 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpctf\" (UniqueName: \"kubernetes.io/projected/41998dcd-34d5-4335-a5fa-8e6ceb8aac4c-kube-api-access-bpctf\") pod \"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c\" (UID: \"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c\") " Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.830965 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41998dcd-34d5-4335-a5fa-8e6ceb8aac4c-operator-scripts\") pod \"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c\" (UID: \"41998dcd-34d5-4335-a5fa-8e6ceb8aac4c\") " Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.831008 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/747fc25a-59ce-428f-8459-6180355f4629-operator-scripts\") pod \"747fc25a-59ce-428f-8459-6180355f4629\" (UID: \"747fc25a-59ce-428f-8459-6180355f4629\") " Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.832117 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/747fc25a-59ce-428f-8459-6180355f4629-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "747fc25a-59ce-428f-8459-6180355f4629" (UID: "747fc25a-59ce-428f-8459-6180355f4629"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.840219 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/747fc25a-59ce-428f-8459-6180355f4629-kube-api-access-7lfhc" (OuterVolumeSpecName: "kube-api-access-7lfhc") pod "747fc25a-59ce-428f-8459-6180355f4629" (UID: "747fc25a-59ce-428f-8459-6180355f4629"). InnerVolumeSpecName "kube-api-access-7lfhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.844116 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57e217c-6f7f-4ccc-9083-db0620a54c8d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d57e217c-6f7f-4ccc-9083-db0620a54c8d" (UID: "d57e217c-6f7f-4ccc-9083-db0620a54c8d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.844164 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d57e217c-6f7f-4ccc-9083-db0620a54c8d-kube-api-access-jdzzj" (OuterVolumeSpecName: "kube-api-access-jdzzj") pod "d57e217c-6f7f-4ccc-9083-db0620a54c8d" (UID: "d57e217c-6f7f-4ccc-9083-db0620a54c8d"). 
InnerVolumeSpecName "kube-api-access-jdzzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.844492 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41998dcd-34d5-4335-a5fa-8e6ceb8aac4c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41998dcd-34d5-4335-a5fa-8e6ceb8aac4c" (UID: "41998dcd-34d5-4335-a5fa-8e6ceb8aac4c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:01:30 crc kubenswrapper[4776]: I1204 10:01:30.849147 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41998dcd-34d5-4335-a5fa-8e6ceb8aac4c-kube-api-access-bpctf" (OuterVolumeSpecName: "kube-api-access-bpctf") pod "41998dcd-34d5-4335-a5fa-8e6ceb8aac4c" (UID: "41998dcd-34d5-4335-a5fa-8e6ceb8aac4c"). InnerVolumeSpecName "kube-api-access-bpctf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:01:31 crc kubenswrapper[4776]: I1204 10:01:31.014318 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lfhc\" (UniqueName: \"kubernetes.io/projected/747fc25a-59ce-428f-8459-6180355f4629-kube-api-access-7lfhc\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:31 crc kubenswrapper[4776]: I1204 10:01:31.014341 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57e217c-6f7f-4ccc-9083-db0620a54c8d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:31 crc kubenswrapper[4776]: I1204 10:01:31.014350 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdzzj\" (UniqueName: \"kubernetes.io/projected/d57e217c-6f7f-4ccc-9083-db0620a54c8d-kube-api-access-jdzzj\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:31 crc kubenswrapper[4776]: I1204 10:01:31.014359 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpctf\" 
(UniqueName: \"kubernetes.io/projected/41998dcd-34d5-4335-a5fa-8e6ceb8aac4c-kube-api-access-bpctf\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:31 crc kubenswrapper[4776]: I1204 10:01:31.014369 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41998dcd-34d5-4335-a5fa-8e6ceb8aac4c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:31 crc kubenswrapper[4776]: I1204 10:01:31.014377 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/747fc25a-59ce-428f-8459-6180355f4629-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:01:31 crc kubenswrapper[4776]: I1204 10:01:31.435613 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:01:31 crc kubenswrapper[4776]: I1204 10:01:31.465688 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73bc5f13-b110-4ad4-a830-ed9d5e6d9800" path="/var/lib/kubelet/pods/73bc5f13-b110-4ad4-a830-ed9d5e6d9800/volumes" Dec 04 10:01:31 crc kubenswrapper[4776]: I1204 10:01:31.693449 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cda9-account-create-update-mhwlb" Dec 04 10:01:31 crc kubenswrapper[4776]: I1204 10:01:31.693489 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-858g8" Dec 04 10:01:31 crc kubenswrapper[4776]: I1204 10:01:31.693529 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c7c7242-2681-4ad6-9ae8-4b917638ccd9","Type":"ContainerStarted","Data":"9ce59a85e492a6202e834110ccf7b0eab3b2a341d619c41473fe41e9bc37aee6"} Dec 04 10:01:31 crc kubenswrapper[4776]: I1204 10:01:31.693595 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7d33-account-create-update-j4l8p" Dec 04 10:01:32 crc kubenswrapper[4776]: I1204 10:01:32.717806 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c7c7242-2681-4ad6-9ae8-4b917638ccd9","Type":"ContainerStarted","Data":"12a9a17e6911948fc02a4899ff4a797c2e6ca997dacfda2edeb1b3105dbd2aab"} Dec 04 10:01:33 crc kubenswrapper[4776]: I1204 10:01:33.128006 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 04 10:01:33 crc kubenswrapper[4776]: I1204 10:01:33.730767 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c7c7242-2681-4ad6-9ae8-4b917638ccd9","Type":"ContainerStarted","Data":"e42e9f60b9802f4c399c2eccf10a3ee259bb54cc14a80024fe7be7fb141f344b"} Dec 04 10:01:33 crc kubenswrapper[4776]: I1204 10:01:33.731363 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c7c7242-2681-4ad6-9ae8-4b917638ccd9","Type":"ContainerStarted","Data":"d7c9a69fd883300050fdaa7f429482cf75f683fa88d86236524b49edd5b42e9f"} Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.245357 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8jnr4"] Dec 04 10:01:35 crc kubenswrapper[4776]: E1204 10:01:35.246181 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57e217c-6f7f-4ccc-9083-db0620a54c8d" containerName="mariadb-account-create-update" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.246201 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d57e217c-6f7f-4ccc-9083-db0620a54c8d" containerName="mariadb-account-create-update" Dec 04 10:01:35 crc kubenswrapper[4776]: E1204 10:01:35.246229 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="747fc25a-59ce-428f-8459-6180355f4629" containerName="mariadb-database-create" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 
10:01:35.246238 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="747fc25a-59ce-428f-8459-6180355f4629" containerName="mariadb-database-create" Dec 04 10:01:35 crc kubenswrapper[4776]: E1204 10:01:35.246270 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41998dcd-34d5-4335-a5fa-8e6ceb8aac4c" containerName="mariadb-account-create-update" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.246279 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="41998dcd-34d5-4335-a5fa-8e6ceb8aac4c" containerName="mariadb-account-create-update" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.246508 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57e217c-6f7f-4ccc-9083-db0620a54c8d" containerName="mariadb-account-create-update" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.246526 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="747fc25a-59ce-428f-8459-6180355f4629" containerName="mariadb-database-create" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.246542 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="41998dcd-34d5-4335-a5fa-8e6ceb8aac4c" containerName="mariadb-account-create-update" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.247361 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.249360 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.249823 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.251559 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9lbjw" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.260939 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8jnr4"] Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.409223 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8jnr4\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") " pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.409286 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-config-data\") pod \"nova-cell0-conductor-db-sync-8jnr4\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") " pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.409589 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-scripts\") pod \"nova-cell0-conductor-db-sync-8jnr4\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") " 
pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.409773 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqkm5\" (UniqueName: \"kubernetes.io/projected/8ed96537-e4c0-433d-8b37-bf0e2c673816-kube-api-access-nqkm5\") pod \"nova-cell0-conductor-db-sync-8jnr4\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") " pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.511748 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqkm5\" (UniqueName: \"kubernetes.io/projected/8ed96537-e4c0-433d-8b37-bf0e2c673816-kube-api-access-nqkm5\") pod \"nova-cell0-conductor-db-sync-8jnr4\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") " pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.511859 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8jnr4\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") " pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.511893 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-config-data\") pod \"nova-cell0-conductor-db-sync-8jnr4\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") " pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.511977 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-scripts\") pod \"nova-cell0-conductor-db-sync-8jnr4\" (UID: 
\"8ed96537-e4c0-433d-8b37-bf0e2c673816\") " pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.517575 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8jnr4\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") " pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.518008 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-scripts\") pod \"nova-cell0-conductor-db-sync-8jnr4\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") " pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.518064 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-config-data\") pod \"nova-cell0-conductor-db-sync-8jnr4\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") " pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.532569 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqkm5\" (UniqueName: \"kubernetes.io/projected/8ed96537-e4c0-433d-8b37-bf0e2c673816-kube-api-access-nqkm5\") pod \"nova-cell0-conductor-db-sync-8jnr4\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") " pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.573091 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8jnr4" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.760425 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c7c7242-2681-4ad6-9ae8-4b917638ccd9","Type":"ContainerStarted","Data":"b2fe61d297069c117db574598f8d0cb385e6d7b615e59f2dd0ca820a8f7489f2"} Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.760855 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:01:35 crc kubenswrapper[4776]: I1204 10:01:35.797126 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.650288921 podStartE2EDuration="5.797103796s" podCreationTimestamp="2025-12-04 10:01:30 +0000 UTC" firstStartedPulling="2025-12-04 10:01:31.441183348 +0000 UTC m=+1336.307663735" lastFinishedPulling="2025-12-04 10:01:34.587998233 +0000 UTC m=+1339.454478610" observedRunningTime="2025-12-04 10:01:35.784218721 +0000 UTC m=+1340.650699118" watchObservedRunningTime="2025-12-04 10:01:35.797103796 +0000 UTC m=+1340.663584173" Dec 04 10:01:36 crc kubenswrapper[4776]: I1204 10:01:36.126710 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8jnr4"] Dec 04 10:01:36 crc kubenswrapper[4776]: W1204 10:01:36.131343 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ed96537_e4c0_433d_8b37_bf0e2c673816.slice/crio-2a890c649571f5aaec0b713056cd9c5a39dbf7d937ccc3ad957b12f4c7aa4c79 WatchSource:0}: Error finding container 2a890c649571f5aaec0b713056cd9c5a39dbf7d937ccc3ad957b12f4c7aa4c79: Status 404 returned error can't find the container with id 2a890c649571f5aaec0b713056cd9c5a39dbf7d937ccc3ad957b12f4c7aa4c79 Dec 04 10:01:36 crc kubenswrapper[4776]: I1204 10:01:36.774066 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-8jnr4" event={"ID":"8ed96537-e4c0-433d-8b37-bf0e2c673816","Type":"ContainerStarted","Data":"2a890c649571f5aaec0b713056cd9c5a39dbf7d937ccc3ad957b12f4c7aa4c79"} Dec 04 10:01:46 crc kubenswrapper[4776]: I1204 10:01:46.871761 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8jnr4" event={"ID":"8ed96537-e4c0-433d-8b37-bf0e2c673816","Type":"ContainerStarted","Data":"d00caf7ed44f8322e580717ac80cec1f1525d8cac5270dfa8652f866b238a26d"} Dec 04 10:01:46 crc kubenswrapper[4776]: I1204 10:01:46.926452 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-8jnr4" podStartSLOduration=1.838708307 podStartE2EDuration="11.926419916s" podCreationTimestamp="2025-12-04 10:01:35 +0000 UTC" firstStartedPulling="2025-12-04 10:01:36.134002563 +0000 UTC m=+1341.000482940" lastFinishedPulling="2025-12-04 10:01:46.221714172 +0000 UTC m=+1351.088194549" observedRunningTime="2025-12-04 10:01:46.895178833 +0000 UTC m=+1351.761659210" watchObservedRunningTime="2025-12-04 10:01:46.926419916 +0000 UTC m=+1351.792900293" Dec 04 10:01:49 crc kubenswrapper[4776]: I1204 10:01:49.380325 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:01:49 crc kubenswrapper[4776]: I1204 10:01:49.380910 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:01:49 crc kubenswrapper[4776]: I1204 10:01:49.381024 4776 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 10:01:49 crc kubenswrapper[4776]: I1204 10:01:49.382160 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5787add21e877423617566cc01fd1cd5d93ab12b7726098df3a77184a49fa270"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:01:49 crc kubenswrapper[4776]: I1204 10:01:49.382261 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://5787add21e877423617566cc01fd1cd5d93ab12b7726098df3a77184a49fa270" gracePeriod=600 Dec 04 10:01:49 crc kubenswrapper[4776]: I1204 10:01:49.910539 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="5787add21e877423617566cc01fd1cd5d93ab12b7726098df3a77184a49fa270" exitCode=0 Dec 04 10:01:49 crc kubenswrapper[4776]: I1204 10:01:49.910628 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"5787add21e877423617566cc01fd1cd5d93ab12b7726098df3a77184a49fa270"} Dec 04 10:01:49 crc kubenswrapper[4776]: I1204 10:01:49.911291 4776 scope.go:117] "RemoveContainer" containerID="e01a20d48aa8f7249b057929edbda0928b81534859b7bbd6d1f1ff0ee5da05c8" Dec 04 10:01:50 crc kubenswrapper[4776]: I1204 10:01:50.924139 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" 
event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96"}
Dec 04 10:01:53 crc kubenswrapper[4776]: I1204 10:01:53.287097 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 10:01:53 crc kubenswrapper[4776]: I1204 10:01:53.287976 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="ceilometer-central-agent" containerID="cri-o://12a9a17e6911948fc02a4899ff4a797c2e6ca997dacfda2edeb1b3105dbd2aab" gracePeriod=30
Dec 04 10:01:53 crc kubenswrapper[4776]: I1204 10:01:53.288123 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="proxy-httpd" containerID="cri-o://b2fe61d297069c117db574598f8d0cb385e6d7b615e59f2dd0ca820a8f7489f2" gracePeriod=30
Dec 04 10:01:53 crc kubenswrapper[4776]: I1204 10:01:53.288168 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="sg-core" containerID="cri-o://e42e9f60b9802f4c399c2eccf10a3ee259bb54cc14a80024fe7be7fb141f344b" gracePeriod=30
Dec 04 10:01:53 crc kubenswrapper[4776]: I1204 10:01:53.288209 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="ceilometer-notification-agent" containerID="cri-o://d7c9a69fd883300050fdaa7f429482cf75f683fa88d86236524b49edd5b42e9f" gracePeriod=30
Dec 04 10:01:53 crc kubenswrapper[4776]: I1204 10:01:53.299773 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.164:3000/\": EOF"
Dec 04 10:01:53 crc kubenswrapper[4776]: I1204 10:01:53.962124 4776 generic.go:334] "Generic (PLEG): container finished" podID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerID="b2fe61d297069c117db574598f8d0cb385e6d7b615e59f2dd0ca820a8f7489f2" exitCode=0
Dec 04 10:01:53 crc kubenswrapper[4776]: I1204 10:01:53.962441 4776 generic.go:334] "Generic (PLEG): container finished" podID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerID="e42e9f60b9802f4c399c2eccf10a3ee259bb54cc14a80024fe7be7fb141f344b" exitCode=2
Dec 04 10:01:53 crc kubenswrapper[4776]: I1204 10:01:53.962456 4776 generic.go:334] "Generic (PLEG): container finished" podID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerID="12a9a17e6911948fc02a4899ff4a797c2e6ca997dacfda2edeb1b3105dbd2aab" exitCode=0
Dec 04 10:01:53 crc kubenswrapper[4776]: I1204 10:01:53.962250 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c7c7242-2681-4ad6-9ae8-4b917638ccd9","Type":"ContainerDied","Data":"b2fe61d297069c117db574598f8d0cb385e6d7b615e59f2dd0ca820a8f7489f2"}
Dec 04 10:01:53 crc kubenswrapper[4776]: I1204 10:01:53.962495 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c7c7242-2681-4ad6-9ae8-4b917638ccd9","Type":"ContainerDied","Data":"e42e9f60b9802f4c399c2eccf10a3ee259bb54cc14a80024fe7be7fb141f344b"}
Dec 04 10:01:53 crc kubenswrapper[4776]: I1204 10:01:53.962511 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c7c7242-2681-4ad6-9ae8-4b917638ccd9","Type":"ContainerDied","Data":"12a9a17e6911948fc02a4899ff4a797c2e6ca997dacfda2edeb1b3105dbd2aab"}
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.327953 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.417804 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-sg-core-conf-yaml\") pod \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") "
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.417859 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-combined-ca-bundle\") pod \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") "
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.417905 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-log-httpd\") pod \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") "
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.417971 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-run-httpd\") pod \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") "
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.418012 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-config-data\") pod \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") "
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.418060 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-scripts\") pod \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") "
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.418173 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxj6n\" (UniqueName: \"kubernetes.io/projected/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-kube-api-access-hxj6n\") pod \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\" (UID: \"0c7c7242-2681-4ad6-9ae8-4b917638ccd9\") "
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.418577 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0c7c7242-2681-4ad6-9ae8-4b917638ccd9" (UID: "0c7c7242-2681-4ad6-9ae8-4b917638ccd9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.418646 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0c7c7242-2681-4ad6-9ae8-4b917638ccd9" (UID: "0c7c7242-2681-4ad6-9ae8-4b917638ccd9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.423972 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-scripts" (OuterVolumeSpecName: "scripts") pod "0c7c7242-2681-4ad6-9ae8-4b917638ccd9" (UID: "0c7c7242-2681-4ad6-9ae8-4b917638ccd9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.424060 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-kube-api-access-hxj6n" (OuterVolumeSpecName: "kube-api-access-hxj6n") pod "0c7c7242-2681-4ad6-9ae8-4b917638ccd9" (UID: "0c7c7242-2681-4ad6-9ae8-4b917638ccd9"). InnerVolumeSpecName "kube-api-access-hxj6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.446025 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0c7c7242-2681-4ad6-9ae8-4b917638ccd9" (UID: "0c7c7242-2681-4ad6-9ae8-4b917638ccd9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.500753 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c7c7242-2681-4ad6-9ae8-4b917638ccd9" (UID: "0c7c7242-2681-4ad6-9ae8-4b917638ccd9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.511586 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-config-data" (OuterVolumeSpecName: "config-data") pod "0c7c7242-2681-4ad6-9ae8-4b917638ccd9" (UID: "0c7c7242-2681-4ad6-9ae8-4b917638ccd9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.519892 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.520042 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.520186 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.520233 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxj6n\" (UniqueName: \"kubernetes.io/projected/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-kube-api-access-hxj6n\") on node \"crc\" DevicePath \"\""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.520250 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.520263 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.520276 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c7c7242-2681-4ad6-9ae8-4b917638ccd9-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.976700 4776 generic.go:334] "Generic (PLEG): container finished" podID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerID="d7c9a69fd883300050fdaa7f429482cf75f683fa88d86236524b49edd5b42e9f" exitCode=0
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.976740 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c7c7242-2681-4ad6-9ae8-4b917638ccd9","Type":"ContainerDied","Data":"d7c9a69fd883300050fdaa7f429482cf75f683fa88d86236524b49edd5b42e9f"}
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.976764 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.976774 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c7c7242-2681-4ad6-9ae8-4b917638ccd9","Type":"ContainerDied","Data":"9ce59a85e492a6202e834110ccf7b0eab3b2a341d619c41473fe41e9bc37aee6"}
Dec 04 10:01:54 crc kubenswrapper[4776]: I1204 10:01:54.976793 4776 scope.go:117] "RemoveContainer" containerID="b2fe61d297069c117db574598f8d0cb385e6d7b615e59f2dd0ca820a8f7489f2"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.014262 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.014363 4776 scope.go:117] "RemoveContainer" containerID="e42e9f60b9802f4c399c2eccf10a3ee259bb54cc14a80024fe7be7fb141f344b"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.024658 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.050006 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 04 10:01:55 crc kubenswrapper[4776]: E1204 10:01:55.050515 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="proxy-httpd"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.050538 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="proxy-httpd"
Dec 04 10:01:55 crc kubenswrapper[4776]: E1204 10:01:55.050550 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="ceilometer-central-agent"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.050559 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="ceilometer-central-agent"
Dec 04 10:01:55 crc kubenswrapper[4776]: E1204 10:01:55.050594 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="sg-core"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.050601 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="sg-core"
Dec 04 10:01:55 crc kubenswrapper[4776]: E1204 10:01:55.050612 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="ceilometer-notification-agent"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.050619 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="ceilometer-notification-agent"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.050813 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="ceilometer-central-agent"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.050828 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="ceilometer-notification-agent"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.050846 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="proxy-httpd"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.050865 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" containerName="sg-core"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.052726 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.055080 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.055564 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.056269 4776 scope.go:117] "RemoveContainer" containerID="d7c9a69fd883300050fdaa7f429482cf75f683fa88d86236524b49edd5b42e9f"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.073404 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.085242 4776 scope.go:117] "RemoveContainer" containerID="12a9a17e6911948fc02a4899ff4a797c2e6ca997dacfda2edeb1b3105dbd2aab"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.106040 4776 scope.go:117] "RemoveContainer" containerID="b2fe61d297069c117db574598f8d0cb385e6d7b615e59f2dd0ca820a8f7489f2"
Dec 04 10:01:55 crc kubenswrapper[4776]: E1204 10:01:55.106584 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2fe61d297069c117db574598f8d0cb385e6d7b615e59f2dd0ca820a8f7489f2\": container with ID starting with b2fe61d297069c117db574598f8d0cb385e6d7b615e59f2dd0ca820a8f7489f2 not found: ID does not exist" containerID="b2fe61d297069c117db574598f8d0cb385e6d7b615e59f2dd0ca820a8f7489f2"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.106621 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2fe61d297069c117db574598f8d0cb385e6d7b615e59f2dd0ca820a8f7489f2"} err="failed to get container status \"b2fe61d297069c117db574598f8d0cb385e6d7b615e59f2dd0ca820a8f7489f2\": rpc error: code = NotFound desc = could not find container \"b2fe61d297069c117db574598f8d0cb385e6d7b615e59f2dd0ca820a8f7489f2\": container with ID starting with b2fe61d297069c117db574598f8d0cb385e6d7b615e59f2dd0ca820a8f7489f2 not found: ID does not exist"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.106645 4776 scope.go:117] "RemoveContainer" containerID="e42e9f60b9802f4c399c2eccf10a3ee259bb54cc14a80024fe7be7fb141f344b"
Dec 04 10:01:55 crc kubenswrapper[4776]: E1204 10:01:55.106886 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e42e9f60b9802f4c399c2eccf10a3ee259bb54cc14a80024fe7be7fb141f344b\": container with ID starting with e42e9f60b9802f4c399c2eccf10a3ee259bb54cc14a80024fe7be7fb141f344b not found: ID does not exist" containerID="e42e9f60b9802f4c399c2eccf10a3ee259bb54cc14a80024fe7be7fb141f344b"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.106926 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e42e9f60b9802f4c399c2eccf10a3ee259bb54cc14a80024fe7be7fb141f344b"} err="failed to get container status \"e42e9f60b9802f4c399c2eccf10a3ee259bb54cc14a80024fe7be7fb141f344b\": rpc error: code = NotFound desc = could not find container \"e42e9f60b9802f4c399c2eccf10a3ee259bb54cc14a80024fe7be7fb141f344b\": container with ID starting with e42e9f60b9802f4c399c2eccf10a3ee259bb54cc14a80024fe7be7fb141f344b not found: ID does not exist"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.106947 4776 scope.go:117] "RemoveContainer" containerID="d7c9a69fd883300050fdaa7f429482cf75f683fa88d86236524b49edd5b42e9f"
Dec 04 10:01:55 crc kubenswrapper[4776]: E1204 10:01:55.107187 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c9a69fd883300050fdaa7f429482cf75f683fa88d86236524b49edd5b42e9f\": container with ID starting with d7c9a69fd883300050fdaa7f429482cf75f683fa88d86236524b49edd5b42e9f not found: ID does not exist" containerID="d7c9a69fd883300050fdaa7f429482cf75f683fa88d86236524b49edd5b42e9f"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.107208 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c9a69fd883300050fdaa7f429482cf75f683fa88d86236524b49edd5b42e9f"} err="failed to get container status \"d7c9a69fd883300050fdaa7f429482cf75f683fa88d86236524b49edd5b42e9f\": rpc error: code = NotFound desc = could not find container \"d7c9a69fd883300050fdaa7f429482cf75f683fa88d86236524b49edd5b42e9f\": container with ID starting with d7c9a69fd883300050fdaa7f429482cf75f683fa88d86236524b49edd5b42e9f not found: ID does not exist"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.107222 4776 scope.go:117] "RemoveContainer" containerID="12a9a17e6911948fc02a4899ff4a797c2e6ca997dacfda2edeb1b3105dbd2aab"
Dec 04 10:01:55 crc kubenswrapper[4776]: E1204 10:01:55.107644 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12a9a17e6911948fc02a4899ff4a797c2e6ca997dacfda2edeb1b3105dbd2aab\": container with ID starting with 12a9a17e6911948fc02a4899ff4a797c2e6ca997dacfda2edeb1b3105dbd2aab not found: ID does not exist" containerID="12a9a17e6911948fc02a4899ff4a797c2e6ca997dacfda2edeb1b3105dbd2aab"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.107664 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a9a17e6911948fc02a4899ff4a797c2e6ca997dacfda2edeb1b3105dbd2aab"} err="failed to get container status \"12a9a17e6911948fc02a4899ff4a797c2e6ca997dacfda2edeb1b3105dbd2aab\": rpc error: code = NotFound desc = could not find container \"12a9a17e6911948fc02a4899ff4a797c2e6ca997dacfda2edeb1b3105dbd2aab\": container with ID starting with 12a9a17e6911948fc02a4899ff4a797c2e6ca997dacfda2edeb1b3105dbd2aab not found: ID does not exist"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.136819 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-config-data\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.137222 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.137427 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxjx8\" (UniqueName: \"kubernetes.io/projected/65ec0795-dc16-460c-8307-0b5864ff9a59-kube-api-access-qxjx8\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.137492 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-scripts\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.137551 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.137624 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ec0795-dc16-460c-8307-0b5864ff9a59-run-httpd\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.137699 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ec0795-dc16-460c-8307-0b5864ff9a59-log-httpd\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.239305 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-scripts\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.239658 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.239774 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ec0795-dc16-460c-8307-0b5864ff9a59-run-httpd\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.239899 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ec0795-dc16-460c-8307-0b5864ff9a59-log-httpd\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.240065 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-config-data\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.240216 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.240368 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ec0795-dc16-460c-8307-0b5864ff9a59-run-httpd\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.240422 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ec0795-dc16-460c-8307-0b5864ff9a59-log-httpd\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.240623 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxjx8\" (UniqueName: \"kubernetes.io/projected/65ec0795-dc16-460c-8307-0b5864ff9a59-kube-api-access-qxjx8\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.244554 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-scripts\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.244666 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-config-data\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.245076 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.245352 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.259287 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxjx8\" (UniqueName: \"kubernetes.io/projected/65ec0795-dc16-460c-8307-0b5864ff9a59-kube-api-access-qxjx8\") pod \"ceilometer-0\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.370485 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.467744 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c7c7242-2681-4ad6-9ae8-4b917638ccd9" path="/var/lib/kubelet/pods/0c7c7242-2681-4ad6-9ae8-4b917638ccd9/volumes"
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.851043 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 10:01:55 crc kubenswrapper[4776]: W1204 10:01:55.854470 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65ec0795_dc16_460c_8307_0b5864ff9a59.slice/crio-623d31482192ab0496706da2897e895c8a5e98865b92f255b9fb85a18e7b6699 WatchSource:0}: Error finding container 623d31482192ab0496706da2897e895c8a5e98865b92f255b9fb85a18e7b6699: Status 404 returned error can't find the container with id 623d31482192ab0496706da2897e895c8a5e98865b92f255b9fb85a18e7b6699
Dec 04 10:01:55 crc kubenswrapper[4776]: I1204 10:01:55.988310 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ec0795-dc16-460c-8307-0b5864ff9a59","Type":"ContainerStarted","Data":"623d31482192ab0496706da2897e895c8a5e98865b92f255b9fb85a18e7b6699"}
Dec 04 10:01:57 crc kubenswrapper[4776]: I1204 10:01:57.001542 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ec0795-dc16-460c-8307-0b5864ff9a59","Type":"ContainerStarted","Data":"c79c4bde16c46d67255fd17da9687de91ca8d6b376f9f7084c9d1fdb22660815"}
Dec 04 10:01:58 crc kubenswrapper[4776]: I1204 10:01:58.042560 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ec0795-dc16-460c-8307-0b5864ff9a59","Type":"ContainerStarted","Data":"39cf2f4a517dac058be8aa8acaf7c5c3bd75315b8dda4b3ff854eb810933bf36"}
Dec 04 10:01:59 crc kubenswrapper[4776]: I1204 10:01:59.059626 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ec0795-dc16-460c-8307-0b5864ff9a59","Type":"ContainerStarted","Data":"25700ead5b3aab6a01af98c4dbe8918c0a2a8497cae47ac55f04022a90a2651a"}
Dec 04 10:02:00 crc kubenswrapper[4776]: I1204 10:02:00.074632 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ec0795-dc16-460c-8307-0b5864ff9a59","Type":"ContainerStarted","Data":"b669ea7a1befe4f0f3848a9eafa4fd1f6de1991f2555d517d3510c0d0e3e6bce"}
Dec 04 10:02:00 crc kubenswrapper[4776]: I1204 10:02:00.075222 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 04 10:02:00 crc kubenswrapper[4776]: I1204 10:02:00.107520 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.792140531 podStartE2EDuration="5.10750467s" podCreationTimestamp="2025-12-04 10:01:55 +0000 UTC" firstStartedPulling="2025-12-04 10:01:55.857300058 +0000 UTC m=+1360.723780435" lastFinishedPulling="2025-12-04 10:01:59.172664197 +0000 UTC m=+1364.039144574" observedRunningTime="2025-12-04 10:02:00.104395262 +0000 UTC m=+1364.970875659" watchObservedRunningTime="2025-12-04 10:02:00.10750467 +0000 UTC m=+1364.973985047"
Dec 04 10:02:01 crc kubenswrapper[4776]: I1204 10:02:01.085002 4776 generic.go:334] "Generic (PLEG): container finished" podID="8ed96537-e4c0-433d-8b37-bf0e2c673816" containerID="d00caf7ed44f8322e580717ac80cec1f1525d8cac5270dfa8652f866b238a26d" exitCode=0
Dec 04 10:02:01 crc kubenswrapper[4776]: I1204 10:02:01.085199 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8jnr4" event={"ID":"8ed96537-e4c0-433d-8b37-bf0e2c673816","Type":"ContainerDied","Data":"d00caf7ed44f8322e580717ac80cec1f1525d8cac5270dfa8652f866b238a26d"}
Dec 04 10:02:02 crc kubenswrapper[4776]: I1204 10:02:02.532102 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8jnr4"
Dec 04 10:02:02 crc kubenswrapper[4776]: I1204 10:02:02.601147 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-scripts\") pod \"8ed96537-e4c0-433d-8b37-bf0e2c673816\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") "
Dec 04 10:02:02 crc kubenswrapper[4776]: I1204 10:02:02.601261 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqkm5\" (UniqueName: \"kubernetes.io/projected/8ed96537-e4c0-433d-8b37-bf0e2c673816-kube-api-access-nqkm5\") pod \"8ed96537-e4c0-433d-8b37-bf0e2c673816\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") "
Dec 04 10:02:02 crc kubenswrapper[4776]: I1204 10:02:02.601301 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-combined-ca-bundle\") pod \"8ed96537-e4c0-433d-8b37-bf0e2c673816\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") "
Dec 04 10:02:02 crc kubenswrapper[4776]: I1204 10:02:02.601580 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-config-data\") pod \"8ed96537-e4c0-433d-8b37-bf0e2c673816\" (UID: \"8ed96537-e4c0-433d-8b37-bf0e2c673816\") "
Dec 04 10:02:02 crc kubenswrapper[4776]: I1204 10:02:02.610418 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ed96537-e4c0-433d-8b37-bf0e2c673816-kube-api-access-nqkm5" (OuterVolumeSpecName: "kube-api-access-nqkm5") pod "8ed96537-e4c0-433d-8b37-bf0e2c673816" (UID: "8ed96537-e4c0-433d-8b37-bf0e2c673816"). InnerVolumeSpecName "kube-api-access-nqkm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:02:02 crc kubenswrapper[4776]: I1204 10:02:02.610685 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-scripts" (OuterVolumeSpecName: "scripts") pod "8ed96537-e4c0-433d-8b37-bf0e2c673816" (UID: "8ed96537-e4c0-433d-8b37-bf0e2c673816"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:02:02 crc kubenswrapper[4776]: I1204 10:02:02.634737 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ed96537-e4c0-433d-8b37-bf0e2c673816" (UID: "8ed96537-e4c0-433d-8b37-bf0e2c673816"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:02:02 crc kubenswrapper[4776]: I1204 10:02:02.637930 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-config-data" (OuterVolumeSpecName: "config-data") pod "8ed96537-e4c0-433d-8b37-bf0e2c673816" (UID: "8ed96537-e4c0-433d-8b37-bf0e2c673816"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:02:02 crc kubenswrapper[4776]: I1204 10:02:02.703864 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 10:02:02 crc kubenswrapper[4776]: I1204 10:02:02.704194 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 10:02:02 crc kubenswrapper[4776]: I1204 10:02:02.704284 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqkm5\" (UniqueName: \"kubernetes.io/projected/8ed96537-e4c0-433d-8b37-bf0e2c673816-kube-api-access-nqkm5\") on node \"crc\" DevicePath \"\""
Dec 04 10:02:02 crc kubenswrapper[4776]: I1204 10:02:02.704394 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed96537-e4c0-433d-8b37-bf0e2c673816-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.103363 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8jnr4" event={"ID":"8ed96537-e4c0-433d-8b37-bf0e2c673816","Type":"ContainerDied","Data":"2a890c649571f5aaec0b713056cd9c5a39dbf7d937ccc3ad957b12f4c7aa4c79"}
Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.103404 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a890c649571f5aaec0b713056cd9c5a39dbf7d937ccc3ad957b12f4c7aa4c79"
Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.103470 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8jnr4"
Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.254931 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 04 10:02:03 crc kubenswrapper[4776]: E1204 10:02:03.255359 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed96537-e4c0-433d-8b37-bf0e2c673816" containerName="nova-cell0-conductor-db-sync"
Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.255378 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed96537-e4c0-433d-8b37-bf0e2c673816" containerName="nova-cell0-conductor-db-sync"
Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.255623 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed96537-e4c0-433d-8b37-bf0e2c673816" containerName="nova-cell0-conductor-db-sync"
Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.256349 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.258358 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9lbjw"
Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.260437 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.313253 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.327975 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ae80cb-2f88-4707-aa4a-777b2d4e3b99-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a5ae80cb-2f88-4707-aa4a-777b2d4e3b99\") " pod="openstack/nova-cell0-conductor-0"
Dec 04 10:02:03 crc kubenswrapper[4776]: I1204
10:02:03.328210 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ae80cb-2f88-4707-aa4a-777b2d4e3b99-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a5ae80cb-2f88-4707-aa4a-777b2d4e3b99\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.328775 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlfq7\" (UniqueName: \"kubernetes.io/projected/a5ae80cb-2f88-4707-aa4a-777b2d4e3b99-kube-api-access-jlfq7\") pod \"nova-cell0-conductor-0\" (UID: \"a5ae80cb-2f88-4707-aa4a-777b2d4e3b99\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.437264 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlfq7\" (UniqueName: \"kubernetes.io/projected/a5ae80cb-2f88-4707-aa4a-777b2d4e3b99-kube-api-access-jlfq7\") pod \"nova-cell0-conductor-0\" (UID: \"a5ae80cb-2f88-4707-aa4a-777b2d4e3b99\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.437408 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ae80cb-2f88-4707-aa4a-777b2d4e3b99-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a5ae80cb-2f88-4707-aa4a-777b2d4e3b99\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.437618 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ae80cb-2f88-4707-aa4a-777b2d4e3b99-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a5ae80cb-2f88-4707-aa4a-777b2d4e3b99\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.446346 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ae80cb-2f88-4707-aa4a-777b2d4e3b99-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a5ae80cb-2f88-4707-aa4a-777b2d4e3b99\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.448074 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ae80cb-2f88-4707-aa4a-777b2d4e3b99-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a5ae80cb-2f88-4707-aa4a-777b2d4e3b99\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.475779 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlfq7\" (UniqueName: \"kubernetes.io/projected/a5ae80cb-2f88-4707-aa4a-777b2d4e3b99-kube-api-access-jlfq7\") pod \"nova-cell0-conductor-0\" (UID: \"a5ae80cb-2f88-4707-aa4a-777b2d4e3b99\") " pod="openstack/nova-cell0-conductor-0" Dec 04 10:02:03 crc kubenswrapper[4776]: I1204 10:02:03.577120 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 10:02:04 crc kubenswrapper[4776]: I1204 10:02:04.108778 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 10:02:05 crc kubenswrapper[4776]: I1204 10:02:05.126412 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a5ae80cb-2f88-4707-aa4a-777b2d4e3b99","Type":"ContainerStarted","Data":"63b176018a3ff562de5fae98ab127120edf3ef2b6b67cdf9927c9356a9aeb5bd"} Dec 04 10:02:05 crc kubenswrapper[4776]: I1204 10:02:05.127019 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a5ae80cb-2f88-4707-aa4a-777b2d4e3b99","Type":"ContainerStarted","Data":"17aedc7a54099098d3f916eaa15e8bd0a9173a873b7ef18cf4c20cf8fb82f127"} Dec 04 10:02:05 crc kubenswrapper[4776]: I1204 10:02:05.127082 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 04 10:02:05 crc kubenswrapper[4776]: I1204 10:02:05.162700 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.16268194 podStartE2EDuration="2.16268194s" podCreationTimestamp="2025-12-04 10:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:02:05.151590583 +0000 UTC m=+1370.018070990" watchObservedRunningTime="2025-12-04 10:02:05.16268194 +0000 UTC m=+1370.029162327" Dec 04 10:02:13 crc kubenswrapper[4776]: I1204 10:02:13.603530 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.125148 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-h9cq9"] Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.126987 4776 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.128941 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.129135 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.139673 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h9cq9"] Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.216048 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brkxp\" (UniqueName: \"kubernetes.io/projected/9ae6f2de-abd7-4410-b165-82a134e89e93-kube-api-access-brkxp\") pod \"nova-cell0-cell-mapping-h9cq9\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.216312 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-scripts\") pod \"nova-cell0-cell-mapping-h9cq9\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.216369 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h9cq9\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.216484 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-config-data\") pod \"nova-cell0-cell-mapping-h9cq9\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.318534 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-config-data\") pod \"nova-cell0-cell-mapping-h9cq9\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.318640 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brkxp\" (UniqueName: \"kubernetes.io/projected/9ae6f2de-abd7-4410-b165-82a134e89e93-kube-api-access-brkxp\") pod \"nova-cell0-cell-mapping-h9cq9\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.318710 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-scripts\") pod \"nova-cell0-cell-mapping-h9cq9\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.318729 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h9cq9\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.330348 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-config-data\") pod \"nova-cell0-cell-mapping-h9cq9\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.347860 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h9cq9\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.362540 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-scripts\") pod \"nova-cell0-cell-mapping-h9cq9\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.373249 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.374831 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.375055 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brkxp\" (UniqueName: \"kubernetes.io/projected/9ae6f2de-abd7-4410-b165-82a134e89e93-kube-api-access-brkxp\") pod \"nova-cell0-cell-mapping-h9cq9\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.379398 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.408344 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.418422 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.423370 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4478\" (UniqueName: \"kubernetes.io/projected/f995b914-c112-421f-b350-c8e3c4e029f1-kube-api-access-q4478\") pod \"nova-cell1-novncproxy-0\" (UID: \"f995b914-c112-421f-b350-c8e3c4e029f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.423702 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f995b914-c112-421f-b350-c8e3c4e029f1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f995b914-c112-421f-b350-c8e3c4e029f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.423958 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f995b914-c112-421f-b350-c8e3c4e029f1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f995b914-c112-421f-b350-c8e3c4e029f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.427010 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.429657 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.459959 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.473377 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.518763 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.526469 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.530856 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.554100 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.554657 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82262e7e-617d-4583-b2d7-db358f9574f7-config-data\") pod \"nova-metadata-0\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.554756 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f995b914-c112-421f-b350-c8e3c4e029f1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f995b914-c112-421f-b350-c8e3c4e029f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.554884 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr8dx\" (UniqueName: \"kubernetes.io/projected/82262e7e-617d-4583-b2d7-db358f9574f7-kube-api-access-qr8dx\") pod \"nova-metadata-0\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.554906 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4478\" (UniqueName: \"kubernetes.io/projected/f995b914-c112-421f-b350-c8e3c4e029f1-kube-api-access-q4478\") pod \"nova-cell1-novncproxy-0\" (UID: \"f995b914-c112-421f-b350-c8e3c4e029f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.560809 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82262e7e-617d-4583-b2d7-db358f9574f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.560895 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82262e7e-617d-4583-b2d7-db358f9574f7-logs\") pod \"nova-metadata-0\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.560959 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f995b914-c112-421f-b350-c8e3c4e029f1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f995b914-c112-421f-b350-c8e3c4e029f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.572391 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f995b914-c112-421f-b350-c8e3c4e029f1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f995b914-c112-421f-b350-c8e3c4e029f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.599789 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-hkg68"] Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.601482 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.602287 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f995b914-c112-421f-b350-c8e3c4e029f1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f995b914-c112-421f-b350-c8e3c4e029f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.627899 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4478\" (UniqueName: \"kubernetes.io/projected/f995b914-c112-421f-b350-c8e3c4e029f1-kube-api-access-q4478\") pod \"nova-cell1-novncproxy-0\" (UID: \"f995b914-c112-421f-b350-c8e3c4e029f1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.678970 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2157cbd-6aa5-4b54-995e-c528d86c3f83-logs\") pod \"nova-api-0\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " pod="openstack/nova-api-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.680244 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfw6t\" (UniqueName: \"kubernetes.io/projected/a2157cbd-6aa5-4b54-995e-c528d86c3f83-kube-api-access-tfw6t\") pod \"nova-api-0\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " pod="openstack/nova-api-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.680444 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr8dx\" (UniqueName: \"kubernetes.io/projected/82262e7e-617d-4583-b2d7-db358f9574f7-kube-api-access-qr8dx\") pod \"nova-metadata-0\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.680551 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82262e7e-617d-4583-b2d7-db358f9574f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.680588 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82262e7e-617d-4583-b2d7-db358f9574f7-logs\") pod \"nova-metadata-0\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.682517 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2157cbd-6aa5-4b54-995e-c528d86c3f83-config-data\") pod \"nova-api-0\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " pod="openstack/nova-api-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.682592 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2157cbd-6aa5-4b54-995e-c528d86c3f83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " pod="openstack/nova-api-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.682632 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82262e7e-617d-4583-b2d7-db358f9574f7-config-data\") pod \"nova-metadata-0\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.686383 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82262e7e-617d-4583-b2d7-db358f9574f7-logs\") pod \"nova-metadata-0\" (UID: 
\"82262e7e-617d-4583-b2d7-db358f9574f7\") " pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.694227 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82262e7e-617d-4583-b2d7-db358f9574f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.703130 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.705547 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.709342 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82262e7e-617d-4583-b2d7-db358f9574f7-config-data\") pod \"nova-metadata-0\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.718324 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr8dx\" (UniqueName: \"kubernetes.io/projected/82262e7e-617d-4583-b2d7-db358f9574f7-kube-api-access-qr8dx\") pod \"nova-metadata-0\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.721752 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.775661 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-hkg68"] Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.784750 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c11e0ae0-1bd1-45f9-9baa-08d33108d638-config-data\") pod \"nova-scheduler-0\" (UID: \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.784836 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-config\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.784864 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.784928 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2157cbd-6aa5-4b54-995e-c528d86c3f83-config-data\") pod \"nova-api-0\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " pod="openstack/nova-api-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.784950 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ttb6\" (UniqueName: \"kubernetes.io/projected/c11e0ae0-1bd1-45f9-9baa-08d33108d638-kube-api-access-2ttb6\") pod \"nova-scheduler-0\" (UID: \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.784968 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2157cbd-6aa5-4b54-995e-c528d86c3f83-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " pod="openstack/nova-api-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.784987 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-dns-svc\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.785037 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2157cbd-6aa5-4b54-995e-c528d86c3f83-logs\") pod \"nova-api-0\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " pod="openstack/nova-api-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.785058 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e0ae0-1bd1-45f9-9baa-08d33108d638-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.785090 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfw6t\" (UniqueName: \"kubernetes.io/projected/a2157cbd-6aa5-4b54-995e-c528d86c3f83-kube-api-access-tfw6t\") pod \"nova-api-0\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " pod="openstack/nova-api-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.785128 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46c5l\" (UniqueName: \"kubernetes.io/projected/e63d098b-b030-47b1-be04-01e6de4d6cc9-kube-api-access-46c5l\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" 
Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.785150 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.786847 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2157cbd-6aa5-4b54-995e-c528d86c3f83-logs\") pod \"nova-api-0\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " pod="openstack/nova-api-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.790860 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2157cbd-6aa5-4b54-995e-c528d86c3f83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " pod="openstack/nova-api-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.791878 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2157cbd-6aa5-4b54-995e-c528d86c3f83-config-data\") pod \"nova-api-0\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " pod="openstack/nova-api-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.795464 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.809522 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.815054 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfw6t\" (UniqueName: \"kubernetes.io/projected/a2157cbd-6aa5-4b54-995e-c528d86c3f83-kube-api-access-tfw6t\") pod \"nova-api-0\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " pod="openstack/nova-api-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.830669 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.887388 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ttb6\" (UniqueName: \"kubernetes.io/projected/c11e0ae0-1bd1-45f9-9baa-08d33108d638-kube-api-access-2ttb6\") pod \"nova-scheduler-0\" (UID: \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.887988 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-dns-svc\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.890034 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e0ae0-1bd1-45f9-9baa-08d33108d638-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.890279 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46c5l\" (UniqueName: \"kubernetes.io/projected/e63d098b-b030-47b1-be04-01e6de4d6cc9-kube-api-access-46c5l\") pod 
\"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.890326 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.890437 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e0ae0-1bd1-45f9-9baa-08d33108d638-config-data\") pod \"nova-scheduler-0\" (UID: \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.890568 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-config\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.890664 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.901424 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e0ae0-1bd1-45f9-9baa-08d33108d638-config-data\") pod \"nova-scheduler-0\" (UID: \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:14 
crc kubenswrapper[4776]: I1204 10:02:14.902350 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.902392 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.903131 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e0ae0-1bd1-45f9-9baa-08d33108d638-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.905519 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-config\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.905584 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-dns-svc\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.920219 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-46c5l\" (UniqueName: \"kubernetes.io/projected/e63d098b-b030-47b1-be04-01e6de4d6cc9-kube-api-access-46c5l\") pod \"dnsmasq-dns-566b5b7845-hkg68\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:14 crc kubenswrapper[4776]: I1204 10:02:14.921327 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ttb6\" (UniqueName: \"kubernetes.io/projected/c11e0ae0-1bd1-45f9-9baa-08d33108d638-kube-api-access-2ttb6\") pod \"nova-scheduler-0\" (UID: \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:15 crc kubenswrapper[4776]: I1204 10:02:15.010868 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:02:15 crc kubenswrapper[4776]: I1204 10:02:15.043663 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:15 crc kubenswrapper[4776]: I1204 10:02:15.044100 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:02:15 crc kubenswrapper[4776]: I1204 10:02:15.102357 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h9cq9"] Dec 04 10:02:15 crc kubenswrapper[4776]: I1204 10:02:15.313200 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h9cq9" event={"ID":"9ae6f2de-abd7-4410-b165-82a134e89e93","Type":"ContainerStarted","Data":"b27aab6dd359f6664667284bddb120c6623e10e30972a6018c10d96975999ccd"} Dec 04 10:02:15 crc kubenswrapper[4776]: I1204 10:02:15.618792 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:15 crc kubenswrapper[4776]: I1204 10:02:15.667950 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:02:15 crc kubenswrapper[4776]: I1204 10:02:15.912598 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wpwrq"] Dec 04 10:02:15 crc kubenswrapper[4776]: I1204 10:02:15.914275 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:15 crc kubenswrapper[4776]: I1204 10:02:15.923203 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 04 10:02:15 crc kubenswrapper[4776]: I1204 10:02:15.923485 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 10:02:15 crc kubenswrapper[4776]: I1204 10:02:15.951061 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wpwrq"] Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.000044 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.070777 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.106213 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfchv\" (UniqueName: \"kubernetes.io/projected/d9ea013b-0a00-4108-9e20-ac957fdbf524-kube-api-access-lfchv\") pod \"nova-cell1-conductor-db-sync-wpwrq\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.106301 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-scripts\") pod \"nova-cell1-conductor-db-sync-wpwrq\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.106401 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wpwrq\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.106428 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-config-data\") pod \"nova-cell1-conductor-db-sync-wpwrq\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.208222 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfchv\" (UniqueName: \"kubernetes.io/projected/d9ea013b-0a00-4108-9e20-ac957fdbf524-kube-api-access-lfchv\") pod \"nova-cell1-conductor-db-sync-wpwrq\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.208282 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-scripts\") pod \"nova-cell1-conductor-db-sync-wpwrq\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.208373 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wpwrq\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.208400 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-config-data\") pod \"nova-cell1-conductor-db-sync-wpwrq\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.214460 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-scripts\") pod \"nova-cell1-conductor-db-sync-wpwrq\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.215214 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-config-data\") pod \"nova-cell1-conductor-db-sync-wpwrq\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.220139 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wpwrq\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.250624 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfchv\" (UniqueName: \"kubernetes.io/projected/d9ea013b-0a00-4108-9e20-ac957fdbf524-kube-api-access-lfchv\") pod \"nova-cell1-conductor-db-sync-wpwrq\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.251695 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:02:16 crc kubenswrapper[4776]: 
I1204 10:02:16.256370 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.267691 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-hkg68"] Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.351419 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2157cbd-6aa5-4b54-995e-c528d86c3f83","Type":"ContainerStarted","Data":"fc823422d28dc34c6eb288b324ad449770d4e17dddc73fc5fde9ce76a826742a"} Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.354747 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c11e0ae0-1bd1-45f9-9baa-08d33108d638","Type":"ContainerStarted","Data":"b466ca9f68302149011f544d0848b180b6a443d925fadfcddae6c47e82f82e00"} Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.357225 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82262e7e-617d-4583-b2d7-db358f9574f7","Type":"ContainerStarted","Data":"8befd07950af47c51457133e2640372237339786c2cf896b74a94d19c0796304"} Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.362731 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f995b914-c112-421f-b350-c8e3c4e029f1","Type":"ContainerStarted","Data":"9496de1d350228349f7148f823784d510674ceab2d56c142630757e3819aa546"} Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.372815 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h9cq9" event={"ID":"9ae6f2de-abd7-4410-b165-82a134e89e93","Type":"ContainerStarted","Data":"f0eb881180561bed3f783cfd79feaeae18711639ead36828bc866e32475aee2b"} Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.378181 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-566b5b7845-hkg68" event={"ID":"e63d098b-b030-47b1-be04-01e6de4d6cc9","Type":"ContainerStarted","Data":"acf9ddfb34f34654e81d3461a4ff6766ae7672b657c09a791b937541593fd46a"} Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.406372 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-h9cq9" podStartSLOduration=2.406344958 podStartE2EDuration="2.406344958s" podCreationTimestamp="2025-12-04 10:02:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:02:16.402382844 +0000 UTC m=+1381.268863241" watchObservedRunningTime="2025-12-04 10:02:16.406344958 +0000 UTC m=+1381.272825335" Dec 04 10:02:16 crc kubenswrapper[4776]: I1204 10:02:16.840844 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wpwrq"] Dec 04 10:02:17 crc kubenswrapper[4776]: I1204 10:02:17.395420 4776 generic.go:334] "Generic (PLEG): container finished" podID="e63d098b-b030-47b1-be04-01e6de4d6cc9" containerID="baedeaf1b891cf7298d43ee0172db044bd48fb71a254e70d2b7bd287dc4bfae3" exitCode=0 Dec 04 10:02:17 crc kubenswrapper[4776]: I1204 10:02:17.395533 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-hkg68" event={"ID":"e63d098b-b030-47b1-be04-01e6de4d6cc9","Type":"ContainerDied","Data":"baedeaf1b891cf7298d43ee0172db044bd48fb71a254e70d2b7bd287dc4bfae3"} Dec 04 10:02:17 crc kubenswrapper[4776]: I1204 10:02:17.402639 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wpwrq" event={"ID":"d9ea013b-0a00-4108-9e20-ac957fdbf524","Type":"ContainerStarted","Data":"67ee3f123aa0b70b77f3a9940f957cd1fde7c13804b6ff9f1b7add78f5ab4540"} Dec 04 10:02:17 crc kubenswrapper[4776]: I1204 10:02:17.402713 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wpwrq" 
event={"ID":"d9ea013b-0a00-4108-9e20-ac957fdbf524","Type":"ContainerStarted","Data":"d50b2b2e04112d8c49e96154e285f04f79d8906eefb1b0be5b52977a9c781aab"} Dec 04 10:02:17 crc kubenswrapper[4776]: I1204 10:02:17.449703 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wpwrq" podStartSLOduration=2.449637466 podStartE2EDuration="2.449637466s" podCreationTimestamp="2025-12-04 10:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:02:17.445683613 +0000 UTC m=+1382.312164000" watchObservedRunningTime="2025-12-04 10:02:17.449637466 +0000 UTC m=+1382.316117843" Dec 04 10:02:18 crc kubenswrapper[4776]: I1204 10:02:18.410039 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:18 crc kubenswrapper[4776]: I1204 10:02:18.443740 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.454984 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82262e7e-617d-4583-b2d7-db358f9574f7","Type":"ContainerStarted","Data":"a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d"} Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.455552 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82262e7e-617d-4583-b2d7-db358f9574f7","Type":"ContainerStarted","Data":"35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae"} Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.455310 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="82262e7e-617d-4583-b2d7-db358f9574f7" containerName="nova-metadata-metadata" containerID="cri-o://a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d" gracePeriod=30 
Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.455046 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="82262e7e-617d-4583-b2d7-db358f9574f7" containerName="nova-metadata-log" containerID="cri-o://35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae" gracePeriod=30 Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.458939 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f995b914-c112-421f-b350-c8e3c4e029f1","Type":"ContainerStarted","Data":"f97f78b7be838374e2ae44dfdfeea7b00b8d203206b367846c79ac28e638bd4d"} Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.459104 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f995b914-c112-421f-b350-c8e3c4e029f1" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f97f78b7be838374e2ae44dfdfeea7b00b8d203206b367846c79ac28e638bd4d" gracePeriod=30 Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.463085 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-hkg68" event={"ID":"e63d098b-b030-47b1-be04-01e6de4d6cc9","Type":"ContainerStarted","Data":"4d347dd11bc21f3d558c16b41b175ba9ac7115d92350b6b9805b0de6e6720a89"} Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.463933 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.467750 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2157cbd-6aa5-4b54-995e-c528d86c3f83","Type":"ContainerStarted","Data":"3cfb8c846582a82da6f7d8caf0d703600433a3d8d90661bdabe91c58f38e972d"} Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.467785 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"a2157cbd-6aa5-4b54-995e-c528d86c3f83","Type":"ContainerStarted","Data":"76a9b7fc47b9de35c7efe796c7e9f5af3eeb8a339caccb1e540eed64d5740c27"} Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.470017 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c11e0ae0-1bd1-45f9-9baa-08d33108d638","Type":"ContainerStarted","Data":"11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8"} Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.490550 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.002447424 podStartE2EDuration="6.490530655s" podCreationTimestamp="2025-12-04 10:02:14 +0000 UTC" firstStartedPulling="2025-12-04 10:02:15.667625937 +0000 UTC m=+1380.534106314" lastFinishedPulling="2025-12-04 10:02:19.155709168 +0000 UTC m=+1384.022189545" observedRunningTime="2025-12-04 10:02:20.48622798 +0000 UTC m=+1385.352708357" watchObservedRunningTime="2025-12-04 10:02:20.490530655 +0000 UTC m=+1385.357011032" Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.504552 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.266375898 podStartE2EDuration="6.504535123s" podCreationTimestamp="2025-12-04 10:02:14 +0000 UTC" firstStartedPulling="2025-12-04 10:02:15.984324604 +0000 UTC m=+1380.850804981" lastFinishedPulling="2025-12-04 10:02:19.222483829 +0000 UTC m=+1384.088964206" observedRunningTime="2025-12-04 10:02:20.503667396 +0000 UTC m=+1385.370147773" watchObservedRunningTime="2025-12-04 10:02:20.504535123 +0000 UTC m=+1385.371015500" Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.525270 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-hkg68" podStartSLOduration=6.525030705 podStartE2EDuration="6.525030705s" podCreationTimestamp="2025-12-04 10:02:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:02:20.519390168 +0000 UTC m=+1385.385870565" watchObservedRunningTime="2025-12-04 10:02:20.525030705 +0000 UTC m=+1385.391511082" Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.542300 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.37148881 podStartE2EDuration="6.542282145s" podCreationTimestamp="2025-12-04 10:02:14 +0000 UTC" firstStartedPulling="2025-12-04 10:02:16.051726905 +0000 UTC m=+1380.918207292" lastFinishedPulling="2025-12-04 10:02:19.22252025 +0000 UTC m=+1384.089000627" observedRunningTime="2025-12-04 10:02:20.535362368 +0000 UTC m=+1385.401842765" watchObservedRunningTime="2025-12-04 10:02:20.542282145 +0000 UTC m=+1385.408762522" Dec 04 10:02:20 crc kubenswrapper[4776]: I1204 10:02:20.561372 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.5590903640000002 podStartE2EDuration="6.561352572s" podCreationTimestamp="2025-12-04 10:02:14 +0000 UTC" firstStartedPulling="2025-12-04 10:02:16.255839616 +0000 UTC m=+1381.122319993" lastFinishedPulling="2025-12-04 10:02:19.258101814 +0000 UTC m=+1384.124582201" observedRunningTime="2025-12-04 10:02:20.551503494 +0000 UTC m=+1385.417983901" watchObservedRunningTime="2025-12-04 10:02:20.561352572 +0000 UTC m=+1385.427832949" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.053154 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.175396 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82262e7e-617d-4583-b2d7-db358f9574f7-logs\") pod \"82262e7e-617d-4583-b2d7-db358f9574f7\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.175514 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82262e7e-617d-4583-b2d7-db358f9574f7-config-data\") pod \"82262e7e-617d-4583-b2d7-db358f9574f7\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.176203 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82262e7e-617d-4583-b2d7-db358f9574f7-logs" (OuterVolumeSpecName: "logs") pod "82262e7e-617d-4583-b2d7-db358f9574f7" (UID: "82262e7e-617d-4583-b2d7-db358f9574f7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.176300 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr8dx\" (UniqueName: \"kubernetes.io/projected/82262e7e-617d-4583-b2d7-db358f9574f7-kube-api-access-qr8dx\") pod \"82262e7e-617d-4583-b2d7-db358f9574f7\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.176472 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82262e7e-617d-4583-b2d7-db358f9574f7-combined-ca-bundle\") pod \"82262e7e-617d-4583-b2d7-db358f9574f7\" (UID: \"82262e7e-617d-4583-b2d7-db358f9574f7\") " Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.177633 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82262e7e-617d-4583-b2d7-db358f9574f7-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.182728 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82262e7e-617d-4583-b2d7-db358f9574f7-kube-api-access-qr8dx" (OuterVolumeSpecName: "kube-api-access-qr8dx") pod "82262e7e-617d-4583-b2d7-db358f9574f7" (UID: "82262e7e-617d-4583-b2d7-db358f9574f7"). InnerVolumeSpecName "kube-api-access-qr8dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.203381 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82262e7e-617d-4583-b2d7-db358f9574f7-config-data" (OuterVolumeSpecName: "config-data") pod "82262e7e-617d-4583-b2d7-db358f9574f7" (UID: "82262e7e-617d-4583-b2d7-db358f9574f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.219176 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82262e7e-617d-4583-b2d7-db358f9574f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82262e7e-617d-4583-b2d7-db358f9574f7" (UID: "82262e7e-617d-4583-b2d7-db358f9574f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.279253 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82262e7e-617d-4583-b2d7-db358f9574f7-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.279291 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr8dx\" (UniqueName: \"kubernetes.io/projected/82262e7e-617d-4583-b2d7-db358f9574f7-kube-api-access-qr8dx\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.279303 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82262e7e-617d-4583-b2d7-db358f9574f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.477702 4776 generic.go:334] "Generic (PLEG): container finished" podID="82262e7e-617d-4583-b2d7-db358f9574f7" containerID="a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d" exitCode=0 Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.478034 4776 generic.go:334] "Generic (PLEG): container finished" podID="82262e7e-617d-4583-b2d7-db358f9574f7" containerID="35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae" exitCode=143 Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.478035 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"82262e7e-617d-4583-b2d7-db358f9574f7","Type":"ContainerDied","Data":"a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d"} Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.478074 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82262e7e-617d-4583-b2d7-db358f9574f7","Type":"ContainerDied","Data":"35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae"} Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.478118 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82262e7e-617d-4583-b2d7-db358f9574f7","Type":"ContainerDied","Data":"8befd07950af47c51457133e2640372237339786c2cf896b74a94d19c0796304"} Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.478140 4776 scope.go:117] "RemoveContainer" containerID="a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.477969 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.515044 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.523869 4776 scope.go:117] "RemoveContainer" containerID="35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.534874 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.548300 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:21 crc kubenswrapper[4776]: E1204 10:02:21.549164 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82262e7e-617d-4583-b2d7-db358f9574f7" containerName="nova-metadata-metadata" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.549192 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="82262e7e-617d-4583-b2d7-db358f9574f7" containerName="nova-metadata-metadata" Dec 04 10:02:21 crc kubenswrapper[4776]: E1204 10:02:21.549211 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82262e7e-617d-4583-b2d7-db358f9574f7" containerName="nova-metadata-log" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.549221 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="82262e7e-617d-4583-b2d7-db358f9574f7" containerName="nova-metadata-log" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.549456 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="82262e7e-617d-4583-b2d7-db358f9574f7" containerName="nova-metadata-metadata" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.549497 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="82262e7e-617d-4583-b2d7-db358f9574f7" containerName="nova-metadata-log" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.562060 4776 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.562228 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.566373 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.567939 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.603147 4776 scope.go:117] "RemoveContainer" containerID="a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d" Dec 04 10:02:21 crc kubenswrapper[4776]: E1204 10:02:21.603945 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d\": container with ID starting with a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d not found: ID does not exist" containerID="a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.604097 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d"} err="failed to get container status \"a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d\": rpc error: code = NotFound desc = could not find container \"a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d\": container with ID starting with a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d not found: ID does not exist" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.604189 4776 scope.go:117] "RemoveContainer" containerID="35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae" 
Dec 04 10:02:21 crc kubenswrapper[4776]: E1204 10:02:21.604588 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae\": container with ID starting with 35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae not found: ID does not exist" containerID="35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.604627 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae"} err="failed to get container status \"35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae\": rpc error: code = NotFound desc = could not find container \"35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae\": container with ID starting with 35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae not found: ID does not exist" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.604649 4776 scope.go:117] "RemoveContainer" containerID="a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.604900 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d"} err="failed to get container status \"a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d\": rpc error: code = NotFound desc = could not find container \"a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d\": container with ID starting with a772cba665d28a5d0c287f1a881ebfec0838f0a3dce6624ada38cb73a4e4586d not found: ID does not exist" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.604949 4776 scope.go:117] "RemoveContainer" 
containerID="35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.605212 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae"} err="failed to get container status \"35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae\": rpc error: code = NotFound desc = could not find container \"35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae\": container with ID starting with 35507223237bc292a27a3c0367674f704669409ea1f1d7fae028cf22962534ae not found: ID does not exist" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.699491 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.699558 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-logs\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.699630 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.699727 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lx8t7\" (UniqueName: \"kubernetes.io/projected/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-kube-api-access-lx8t7\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.700206 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-config-data\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.803089 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-config-data\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.803215 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.803257 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-logs\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.803288 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.803380 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx8t7\" (UniqueName: \"kubernetes.io/projected/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-kube-api-access-lx8t7\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.803982 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-logs\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.808776 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.809009 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-config-data\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.809890 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.838531 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lx8t7\" (UniqueName: \"kubernetes.io/projected/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-kube-api-access-lx8t7\") pod \"nova-metadata-0\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " pod="openstack/nova-metadata-0" Dec 04 10:02:21 crc kubenswrapper[4776]: I1204 10:02:21.880250 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:02:22 crc kubenswrapper[4776]: I1204 10:02:22.327183 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:22 crc kubenswrapper[4776]: I1204 10:02:22.501981 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea","Type":"ContainerStarted","Data":"e5290ae65ae282a37caf07ff6474d3a5397f2d4670da991975a84d748530ea1f"} Dec 04 10:02:23 crc kubenswrapper[4776]: I1204 10:02:23.463815 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82262e7e-617d-4583-b2d7-db358f9574f7" path="/var/lib/kubelet/pods/82262e7e-617d-4583-b2d7-db358f9574f7/volumes" Dec 04 10:02:23 crc kubenswrapper[4776]: I1204 10:02:23.515754 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea","Type":"ContainerStarted","Data":"7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39"} Dec 04 10:02:23 crc kubenswrapper[4776]: I1204 10:02:23.515813 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea","Type":"ContainerStarted","Data":"db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf"} Dec 04 10:02:23 crc kubenswrapper[4776]: I1204 10:02:23.541576 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.54155446 podStartE2EDuration="2.54155446s" podCreationTimestamp="2025-12-04 10:02:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:02:23.536318736 +0000 UTC m=+1388.402799113" watchObservedRunningTime="2025-12-04 10:02:23.54155446 +0000 UTC m=+1388.408034837" Dec 04 10:02:24 crc kubenswrapper[4776]: I1204 10:02:24.541430 4776 generic.go:334] "Generic (PLEG): container finished" podID="9ae6f2de-abd7-4410-b165-82a134e89e93" containerID="f0eb881180561bed3f783cfd79feaeae18711639ead36828bc866e32475aee2b" exitCode=0 Dec 04 10:02:24 crc kubenswrapper[4776]: I1204 10:02:24.541473 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h9cq9" event={"ID":"9ae6f2de-abd7-4410-b165-82a134e89e93","Type":"ContainerDied","Data":"f0eb881180561bed3f783cfd79feaeae18711639ead36828bc866e32475aee2b"} Dec 04 10:02:24 crc kubenswrapper[4776]: I1204 10:02:24.810383 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.013568 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.013657 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.044746 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.044812 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.046895 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.084314 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.119402 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7lwkn"] Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.119703 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" podUID="0b3b18f8-cccb-4159-a2d5-19f75959e6da" containerName="dnsmasq-dns" containerID="cri-o://0e97f76ef4550100d2b4b29faa12d8e727c7c0d7e889b30a612362b823b1f06d" gracePeriod=10 Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.378885 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.557898 4776 generic.go:334] "Generic (PLEG): container finished" podID="0b3b18f8-cccb-4159-a2d5-19f75959e6da" containerID="0e97f76ef4550100d2b4b29faa12d8e727c7c0d7e889b30a612362b823b1f06d" exitCode=0 Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.557985 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" event={"ID":"0b3b18f8-cccb-4159-a2d5-19f75959e6da","Type":"ContainerDied","Data":"0e97f76ef4550100d2b4b29faa12d8e727c7c0d7e889b30a612362b823b1f06d"} Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.559559 4776 generic.go:334] "Generic (PLEG): container finished" podID="d9ea013b-0a00-4108-9e20-ac957fdbf524" containerID="67ee3f123aa0b70b77f3a9940f957cd1fde7c13804b6ff9f1b7add78f5ab4540" exitCode=0 Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.559658 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wpwrq" event={"ID":"d9ea013b-0a00-4108-9e20-ac957fdbf524","Type":"ContainerDied","Data":"67ee3f123aa0b70b77f3a9940f957cd1fde7c13804b6ff9f1b7add78f5ab4540"} Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.639431 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-scheduler-0" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.743930 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.828698 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-ovsdbserver-sb\") pod \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.828867 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-config\") pod \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.829051 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc4s9\" (UniqueName: \"kubernetes.io/projected/0b3b18f8-cccb-4159-a2d5-19f75959e6da-kube-api-access-pc4s9\") pod \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.829110 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-ovsdbserver-nb\") pod \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.829146 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-dns-svc\") pod \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\" (UID: \"0b3b18f8-cccb-4159-a2d5-19f75959e6da\") " Dec 04 
10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.878439 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3b18f8-cccb-4159-a2d5-19f75959e6da-kube-api-access-pc4s9" (OuterVolumeSpecName: "kube-api-access-pc4s9") pod "0b3b18f8-cccb-4159-a2d5-19f75959e6da" (UID: "0b3b18f8-cccb-4159-a2d5-19f75959e6da"). InnerVolumeSpecName "kube-api-access-pc4s9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.917029 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0b3b18f8-cccb-4159-a2d5-19f75959e6da" (UID: "0b3b18f8-cccb-4159-a2d5-19f75959e6da"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.932969 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc4s9\" (UniqueName: \"kubernetes.io/projected/0b3b18f8-cccb-4159-a2d5-19f75959e6da-kube-api-access-pc4s9\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.933009 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.960640 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0b3b18f8-cccb-4159-a2d5-19f75959e6da" (UID: "0b3b18f8-cccb-4159-a2d5-19f75959e6da"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.968261 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.975820 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0b3b18f8-cccb-4159-a2d5-19f75959e6da" (UID: "0b3b18f8-cccb-4159-a2d5-19f75959e6da"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:02:25 crc kubenswrapper[4776]: I1204 10:02:25.979712 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-config" (OuterVolumeSpecName: "config") pod "0b3b18f8-cccb-4159-a2d5-19f75959e6da" (UID: "0b3b18f8-cccb-4159-a2d5-19f75959e6da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.035319 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.035388 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.035403 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b3b18f8-cccb-4159-a2d5-19f75959e6da-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.096228 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a2157cbd-6aa5-4b54-995e-c528d86c3f83" containerName="nova-api-api" probeResult="failure" output="Get 
\"http://10.217.0.171:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.096411 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a2157cbd-6aa5-4b54-995e-c528d86c3f83" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.171:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.136159 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-combined-ca-bundle\") pod \"9ae6f2de-abd7-4410-b165-82a134e89e93\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.136435 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-config-data\") pod \"9ae6f2de-abd7-4410-b165-82a134e89e93\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.137032 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brkxp\" (UniqueName: \"kubernetes.io/projected/9ae6f2de-abd7-4410-b165-82a134e89e93-kube-api-access-brkxp\") pod \"9ae6f2de-abd7-4410-b165-82a134e89e93\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.137222 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-scripts\") pod \"9ae6f2de-abd7-4410-b165-82a134e89e93\" (UID: \"9ae6f2de-abd7-4410-b165-82a134e89e93\") " Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.140503 4776 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-scripts" (OuterVolumeSpecName: "scripts") pod "9ae6f2de-abd7-4410-b165-82a134e89e93" (UID: "9ae6f2de-abd7-4410-b165-82a134e89e93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.140506 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae6f2de-abd7-4410-b165-82a134e89e93-kube-api-access-brkxp" (OuterVolumeSpecName: "kube-api-access-brkxp") pod "9ae6f2de-abd7-4410-b165-82a134e89e93" (UID: "9ae6f2de-abd7-4410-b165-82a134e89e93"). InnerVolumeSpecName "kube-api-access-brkxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.163657 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-config-data" (OuterVolumeSpecName: "config-data") pod "9ae6f2de-abd7-4410-b165-82a134e89e93" (UID: "9ae6f2de-abd7-4410-b165-82a134e89e93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.170356 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ae6f2de-abd7-4410-b165-82a134e89e93" (UID: "9ae6f2de-abd7-4410-b165-82a134e89e93"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.240387 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.240439 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.240450 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brkxp\" (UniqueName: \"kubernetes.io/projected/9ae6f2de-abd7-4410-b165-82a134e89e93-kube-api-access-brkxp\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.240464 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae6f2de-abd7-4410-b165-82a134e89e93-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.572882 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h9cq9" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.572876 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h9cq9" event={"ID":"9ae6f2de-abd7-4410-b165-82a134e89e93","Type":"ContainerDied","Data":"b27aab6dd359f6664667284bddb120c6623e10e30972a6018c10d96975999ccd"} Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.573455 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b27aab6dd359f6664667284bddb120c6623e10e30972a6018c10d96975999ccd" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.575433 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" event={"ID":"0b3b18f8-cccb-4159-a2d5-19f75959e6da","Type":"ContainerDied","Data":"34918026f87ff7cec4c7949478a4fd3d9279a5f075e2885ee7834d063b5abb3b"} Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.575536 4776 scope.go:117] "RemoveContainer" containerID="0e97f76ef4550100d2b4b29faa12d8e727c7c0d7e889b30a612362b823b1f06d" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.575566 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-7lwkn" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.629901 4776 scope.go:117] "RemoveContainer" containerID="9c334638aab625c2458d38d75d5016e4113a05bb82e5e514f14b63b53b658bc1" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.652516 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7lwkn"] Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.662997 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7lwkn"] Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.768348 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.768624 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a2157cbd-6aa5-4b54-995e-c528d86c3f83" containerName="nova-api-log" containerID="cri-o://76a9b7fc47b9de35c7efe796c7e9f5af3eeb8a339caccb1e540eed64d5740c27" gracePeriod=30 Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.768924 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a2157cbd-6aa5-4b54-995e-c528d86c3f83" containerName="nova-api-api" containerID="cri-o://3cfb8c846582a82da6f7d8caf0d703600433a3d8d90661bdabe91c58f38e972d" gracePeriod=30 Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.799772 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.811942 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.812253 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" containerName="nova-metadata-log" 
containerID="cri-o://db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf" gracePeriod=30 Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.812463 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" containerName="nova-metadata-metadata" containerID="cri-o://7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39" gracePeriod=30 Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.881240 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:02:26 crc kubenswrapper[4776]: I1204 10:02:26.881314 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.088629 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.157587 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-config-data\") pod \"d9ea013b-0a00-4108-9e20-ac957fdbf524\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.157778 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfchv\" (UniqueName: \"kubernetes.io/projected/d9ea013b-0a00-4108-9e20-ac957fdbf524-kube-api-access-lfchv\") pod \"d9ea013b-0a00-4108-9e20-ac957fdbf524\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.157824 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-combined-ca-bundle\") pod 
\"d9ea013b-0a00-4108-9e20-ac957fdbf524\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.158822 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-scripts\") pod \"d9ea013b-0a00-4108-9e20-ac957fdbf524\" (UID: \"d9ea013b-0a00-4108-9e20-ac957fdbf524\") " Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.166125 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-scripts" (OuterVolumeSpecName: "scripts") pod "d9ea013b-0a00-4108-9e20-ac957fdbf524" (UID: "d9ea013b-0a00-4108-9e20-ac957fdbf524"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.166332 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ea013b-0a00-4108-9e20-ac957fdbf524-kube-api-access-lfchv" (OuterVolumeSpecName: "kube-api-access-lfchv") pod "d9ea013b-0a00-4108-9e20-ac957fdbf524" (UID: "d9ea013b-0a00-4108-9e20-ac957fdbf524"). InnerVolumeSpecName "kube-api-access-lfchv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.189485 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9ea013b-0a00-4108-9e20-ac957fdbf524" (UID: "d9ea013b-0a00-4108-9e20-ac957fdbf524"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.249320 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-config-data" (OuterVolumeSpecName: "config-data") pod "d9ea013b-0a00-4108-9e20-ac957fdbf524" (UID: "d9ea013b-0a00-4108-9e20-ac957fdbf524"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.261166 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.261196 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.261207 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfchv\" (UniqueName: \"kubernetes.io/projected/d9ea013b-0a00-4108-9e20-ac957fdbf524-kube-api-access-lfchv\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.261218 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ea013b-0a00-4108-9e20-ac957fdbf524-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.469388 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b3b18f8-cccb-4159-a2d5-19f75959e6da" path="/var/lib/kubelet/pods/0b3b18f8-cccb-4159-a2d5-19f75959e6da/volumes" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.489480 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.566340 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-config-data\") pod \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.566494 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-logs\") pod \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.566767 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-nova-metadata-tls-certs\") pod \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.566834 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx8t7\" (UniqueName: \"kubernetes.io/projected/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-kube-api-access-lx8t7\") pod \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.566941 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-combined-ca-bundle\") pod \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\" (UID: \"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea\") " Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.573339 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-logs" (OuterVolumeSpecName: "logs") pod "f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" (UID: "f631b97e-3bd5-4c0f-b5ec-ad18831e9aea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.582572 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-kube-api-access-lx8t7" (OuterVolumeSpecName: "kube-api-access-lx8t7") pod "f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" (UID: "f631b97e-3bd5-4c0f-b5ec-ad18831e9aea"). InnerVolumeSpecName "kube-api-access-lx8t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.645391 4776 generic.go:334] "Generic (PLEG): container finished" podID="a2157cbd-6aa5-4b54-995e-c528d86c3f83" containerID="76a9b7fc47b9de35c7efe796c7e9f5af3eeb8a339caccb1e540eed64d5740c27" exitCode=143 Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.645828 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2157cbd-6aa5-4b54-995e-c528d86c3f83","Type":"ContainerDied","Data":"76a9b7fc47b9de35c7efe796c7e9f5af3eeb8a339caccb1e540eed64d5740c27"} Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.656943 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" (UID: "f631b97e-3bd5-4c0f-b5ec-ad18831e9aea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.669128 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-config-data" (OuterVolumeSpecName: "config-data") pod "f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" (UID: "f631b97e-3bd5-4c0f-b5ec-ad18831e9aea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.671199 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx8t7\" (UniqueName: \"kubernetes.io/projected/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-kube-api-access-lx8t7\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.671224 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.671235 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.671244 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.675602 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wpwrq" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.675593 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wpwrq" event={"ID":"d9ea013b-0a00-4108-9e20-ac957fdbf524","Type":"ContainerDied","Data":"d50b2b2e04112d8c49e96154e285f04f79d8906eefb1b0be5b52977a9c781aab"} Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.676819 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d50b2b2e04112d8c49e96154e285f04f79d8906eefb1b0be5b52977a9c781aab" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.679854 4776 generic.go:334] "Generic (PLEG): container finished" podID="f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" containerID="7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39" exitCode=0 Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.679898 4776 generic.go:334] "Generic (PLEG): container finished" podID="f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" containerID="db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf" exitCode=143 Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.679974 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea","Type":"ContainerDied","Data":"7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39"} Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.680023 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea","Type":"ContainerDied","Data":"db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf"} Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.680041 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f631b97e-3bd5-4c0f-b5ec-ad18831e9aea","Type":"ContainerDied","Data":"e5290ae65ae282a37caf07ff6474d3a5397f2d4670da991975a84d748530ea1f"} Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.680068 4776 scope.go:117] "RemoveContainer" containerID="7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.680217 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.688756 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c11e0ae0-1bd1-45f9-9baa-08d33108d638" containerName="nova-scheduler-scheduler" containerID="cri-o://11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8" gracePeriod=30 Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.735229 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" (UID: "f631b97e-3bd5-4c0f-b5ec-ad18831e9aea"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.744362 4776 scope.go:117] "RemoveContainer" containerID="db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.779056 4776 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.785455 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.806543 4776 scope.go:117] "RemoveContainer" containerID="7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39" Dec 04 10:02:27 crc kubenswrapper[4776]: E1204 10:02:27.808674 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39\": container with ID starting with 7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39 not found: ID does not exist" containerID="7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.811646 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39"} err="failed to get container status \"7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39\": rpc error: code = NotFound desc = could not find container \"7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39\": container with ID starting with 7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39 not found: ID does not exist" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.811866 4776 scope.go:117] 
"RemoveContainer" containerID="db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf" Dec 04 10:02:27 crc kubenswrapper[4776]: E1204 10:02:27.816780 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf\": container with ID starting with db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf not found: ID does not exist" containerID="db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.817568 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf"} err="failed to get container status \"db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf\": rpc error: code = NotFound desc = could not find container \"db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf\": container with ID starting with db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf not found: ID does not exist" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.817789 4776 scope.go:117] "RemoveContainer" containerID="7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.820651 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39"} err="failed to get container status \"7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39\": rpc error: code = NotFound desc = could not find container \"7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39\": container with ID starting with 7ade40dddd4401e82966d275a00f3bcc9cf0e3b9777e26be0e62e11064808e39 not found: ID does not exist" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.822016 4776 
scope.go:117] "RemoveContainer" containerID="db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.825587 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf"} err="failed to get container status \"db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf\": rpc error: code = NotFound desc = could not find container \"db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf\": container with ID starting with db34a2b4a5027cb37c7b7974eb1d06be15afc9ebb76745f7170a4a9cd3df05cf not found: ID does not exist" Dec 04 10:02:27 crc kubenswrapper[4776]: E1204 10:02:27.842611 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ea013b-0a00-4108-9e20-ac957fdbf524" containerName="nova-cell1-conductor-db-sync" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.842671 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ea013b-0a00-4108-9e20-ac957fdbf524" containerName="nova-cell1-conductor-db-sync" Dec 04 10:02:27 crc kubenswrapper[4776]: E1204 10:02:27.842719 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3b18f8-cccb-4159-a2d5-19f75959e6da" containerName="init" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.842728 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3b18f8-cccb-4159-a2d5-19f75959e6da" containerName="init" Dec 04 10:02:27 crc kubenswrapper[4776]: E1204 10:02:27.842759 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" containerName="nova-metadata-metadata" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.842766 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" containerName="nova-metadata-metadata" Dec 04 10:02:27 crc kubenswrapper[4776]: E1204 10:02:27.842785 4776 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3b18f8-cccb-4159-a2d5-19f75959e6da" containerName="dnsmasq-dns" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.842836 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3b18f8-cccb-4159-a2d5-19f75959e6da" containerName="dnsmasq-dns" Dec 04 10:02:27 crc kubenswrapper[4776]: E1204 10:02:27.842856 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" containerName="nova-metadata-log" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.842864 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" containerName="nova-metadata-log" Dec 04 10:02:27 crc kubenswrapper[4776]: E1204 10:02:27.842883 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae6f2de-abd7-4410-b165-82a134e89e93" containerName="nova-manage" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.842890 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae6f2de-abd7-4410-b165-82a134e89e93" containerName="nova-manage" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.843586 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ea013b-0a00-4108-9e20-ac957fdbf524" containerName="nova-cell1-conductor-db-sync" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.843610 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3b18f8-cccb-4159-a2d5-19f75959e6da" containerName="dnsmasq-dns" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.843632 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae6f2de-abd7-4410-b165-82a134e89e93" containerName="nova-manage" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.843646 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" containerName="nova-metadata-metadata" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.843657 
4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" containerName="nova-metadata-log" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.844869 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.847948 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.851792 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.994293 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb549ab7-99fa-4631-b0b5-d4a029e7de33-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"eb549ab7-99fa-4631-b0b5-d4a029e7de33\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.994385 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb549ab7-99fa-4631-b0b5-d4a029e7de33-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"eb549ab7-99fa-4631-b0b5-d4a029e7de33\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:02:27 crc kubenswrapper[4776]: I1204 10:02:27.994452 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndlst\" (UniqueName: \"kubernetes.io/projected/eb549ab7-99fa-4631-b0b5-d4a029e7de33-kube-api-access-ndlst\") pod \"nova-cell1-conductor-0\" (UID: \"eb549ab7-99fa-4631-b0b5-d4a029e7de33\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.035764 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.061330 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.072664 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.075045 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.080141 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.080392 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.087690 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.096031 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb549ab7-99fa-4631-b0b5-d4a029e7de33-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"eb549ab7-99fa-4631-b0b5-d4a029e7de33\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.096096 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndlst\" (UniqueName: \"kubernetes.io/projected/eb549ab7-99fa-4631-b0b5-d4a029e7de33-kube-api-access-ndlst\") pod \"nova-cell1-conductor-0\" (UID: \"eb549ab7-99fa-4631-b0b5-d4a029e7de33\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.096241 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eb549ab7-99fa-4631-b0b5-d4a029e7de33-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"eb549ab7-99fa-4631-b0b5-d4a029e7de33\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.107962 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb549ab7-99fa-4631-b0b5-d4a029e7de33-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"eb549ab7-99fa-4631-b0b5-d4a029e7de33\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.112950 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb549ab7-99fa-4631-b0b5-d4a029e7de33-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"eb549ab7-99fa-4631-b0b5-d4a029e7de33\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.120507 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndlst\" (UniqueName: \"kubernetes.io/projected/eb549ab7-99fa-4631-b0b5-d4a029e7de33-kube-api-access-ndlst\") pod \"nova-cell1-conductor-0\" (UID: \"eb549ab7-99fa-4631-b0b5-d4a029e7de33\") " pod="openstack/nova-cell1-conductor-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.198714 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97af0008-c8e6-4448-99c1-1a465bd92ac9-logs\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.198774 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.199139 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgcmz\" (UniqueName: \"kubernetes.io/projected/97af0008-c8e6-4448-99c1-1a465bd92ac9-kube-api-access-zgcmz\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.199260 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-config-data\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.199501 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.212813 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.304275 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgcmz\" (UniqueName: \"kubernetes.io/projected/97af0008-c8e6-4448-99c1-1a465bd92ac9-kube-api-access-zgcmz\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.304731 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-config-data\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.304822 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.304879 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97af0008-c8e6-4448-99c1-1a465bd92ac9-logs\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.304908 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.309978 4776 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97af0008-c8e6-4448-99c1-1a465bd92ac9-logs\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.312639 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.321548 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-config-data\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.327603 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgcmz\" (UniqueName: \"kubernetes.io/projected/97af0008-c8e6-4448-99c1-1a465bd92ac9-kube-api-access-zgcmz\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.332989 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.415652 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:02:28 crc kubenswrapper[4776]: I1204 10:02:28.887853 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.055409 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.391720 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.392283 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="f57cb5d1-1baa-4fc7-8c71-16d1138dab82" containerName="kube-state-metrics" containerID="cri-o://adef3fbc5f17f46ef92e49977033fc192ebb6059ac071131f71e1cf991a17761" gracePeriod=30 Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.466359 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f631b97e-3bd5-4c0f-b5ec-ad18831e9aea" path="/var/lib/kubelet/pods/f631b97e-3bd5-4c0f-b5ec-ad18831e9aea/volumes" Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.742238 4776 generic.go:334] "Generic (PLEG): container finished" podID="f57cb5d1-1baa-4fc7-8c71-16d1138dab82" containerID="adef3fbc5f17f46ef92e49977033fc192ebb6059ac071131f71e1cf991a17761" exitCode=2 Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.742752 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f57cb5d1-1baa-4fc7-8c71-16d1138dab82","Type":"ContainerDied","Data":"adef3fbc5f17f46ef92e49977033fc192ebb6059ac071131f71e1cf991a17761"} Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.769855 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"97af0008-c8e6-4448-99c1-1a465bd92ac9","Type":"ContainerStarted","Data":"5f96f7f2d202753560fcf2a08a80ca0ca2eeff02be1d727aabc5620f3c969d51"} Dec 04 
10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.769980 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"97af0008-c8e6-4448-99c1-1a465bd92ac9","Type":"ContainerStarted","Data":"5b5d9466438db62721cb4e543cab2c3ca1c0066bbe0bc9376a1a0ea7cc7ae6f4"} Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.769997 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"97af0008-c8e6-4448-99c1-1a465bd92ac9","Type":"ContainerStarted","Data":"fa27d66a40bc1a31f5c4a8a3d0ecf53ac145cb03a76a375d10283b51637be45f"} Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.773024 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"eb549ab7-99fa-4631-b0b5-d4a029e7de33","Type":"ContainerStarted","Data":"b308a5331b18033cc3843c507db04752a18b33b9becb6d105fe6221ef2ef761d"} Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.773079 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"eb549ab7-99fa-4631-b0b5-d4a029e7de33","Type":"ContainerStarted","Data":"a8606b0ff6c0da0ba26f597afbed780dfab90408335f24ec5e8a0aeaa0c87a2a"} Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.774133 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.811482 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.811454247 podStartE2EDuration="1.811454247s" podCreationTimestamp="2025-12-04 10:02:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:02:29.804420767 +0000 UTC m=+1394.670901164" watchObservedRunningTime="2025-12-04 10:02:29.811454247 +0000 UTC m=+1394.677934624" Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 
10:02:29.830594 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.8305605849999997 podStartE2EDuration="2.830560585s" podCreationTimestamp="2025-12-04 10:02:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:02:29.828195882 +0000 UTC m=+1394.694676269" watchObservedRunningTime="2025-12-04 10:02:29.830560585 +0000 UTC m=+1394.697040962" Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.891394 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.962042 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvcsc\" (UniqueName: \"kubernetes.io/projected/f57cb5d1-1baa-4fc7-8c71-16d1138dab82-kube-api-access-lvcsc\") pod \"f57cb5d1-1baa-4fc7-8c71-16d1138dab82\" (UID: \"f57cb5d1-1baa-4fc7-8c71-16d1138dab82\") " Dec 04 10:02:29 crc kubenswrapper[4776]: I1204 10:02:29.971719 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f57cb5d1-1baa-4fc7-8c71-16d1138dab82-kube-api-access-lvcsc" (OuterVolumeSpecName: "kube-api-access-lvcsc") pod "f57cb5d1-1baa-4fc7-8c71-16d1138dab82" (UID: "f57cb5d1-1baa-4fc7-8c71-16d1138dab82"). InnerVolumeSpecName "kube-api-access-lvcsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:02:30 crc kubenswrapper[4776]: E1204 10:02:30.047871 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:02:30 crc kubenswrapper[4776]: E1204 10:02:30.053008 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:02:30 crc kubenswrapper[4776]: E1204 10:02:30.055182 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:02:30 crc kubenswrapper[4776]: E1204 10:02:30.055233 4776 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c11e0ae0-1bd1-45f9-9baa-08d33108d638" containerName="nova-scheduler-scheduler" Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.065574 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvcsc\" (UniqueName: \"kubernetes.io/projected/f57cb5d1-1baa-4fc7-8c71-16d1138dab82-kube-api-access-lvcsc\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.508131 4776 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.508526 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="ceilometer-central-agent" containerID="cri-o://c79c4bde16c46d67255fd17da9687de91ca8d6b376f9f7084c9d1fdb22660815" gracePeriod=30 Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.508609 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="proxy-httpd" containerID="cri-o://b669ea7a1befe4f0f3848a9eafa4fd1f6de1991f2555d517d3510c0d0e3e6bce" gracePeriod=30 Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.508647 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="sg-core" containerID="cri-o://25700ead5b3aab6a01af98c4dbe8918c0a2a8497cae47ac55f04022a90a2651a" gracePeriod=30 Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.508955 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="ceilometer-notification-agent" containerID="cri-o://39cf2f4a517dac058be8aa8acaf7c5c3bd75315b8dda4b3ff854eb810933bf36" gracePeriod=30 Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.787376 4776 generic.go:334] "Generic (PLEG): container finished" podID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerID="b669ea7a1befe4f0f3848a9eafa4fd1f6de1991f2555d517d3510c0d0e3e6bce" exitCode=0 Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.787421 4776 generic.go:334] "Generic (PLEG): container finished" podID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerID="25700ead5b3aab6a01af98c4dbe8918c0a2a8497cae47ac55f04022a90a2651a" exitCode=2 Dec 04 10:02:30 crc kubenswrapper[4776]: 
I1204 10:02:30.787474 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ec0795-dc16-460c-8307-0b5864ff9a59","Type":"ContainerDied","Data":"b669ea7a1befe4f0f3848a9eafa4fd1f6de1991f2555d517d3510c0d0e3e6bce"} Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.787507 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ec0795-dc16-460c-8307-0b5864ff9a59","Type":"ContainerDied","Data":"25700ead5b3aab6a01af98c4dbe8918c0a2a8497cae47ac55f04022a90a2651a"} Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.793205 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.795278 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f57cb5d1-1baa-4fc7-8c71-16d1138dab82","Type":"ContainerDied","Data":"176599bab83495559a072732d4606c07a81028607f92f33808c416b4de9e977c"} Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.795336 4776 scope.go:117] "RemoveContainer" containerID="adef3fbc5f17f46ef92e49977033fc192ebb6059ac071131f71e1cf991a17761" Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.865954 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.879990 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.891232 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:02:30 crc kubenswrapper[4776]: E1204 10:02:30.891603 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f57cb5d1-1baa-4fc7-8c71-16d1138dab82" containerName="kube-state-metrics" Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.891624 4776 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f57cb5d1-1baa-4fc7-8c71-16d1138dab82" containerName="kube-state-metrics" Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.891854 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f57cb5d1-1baa-4fc7-8c71-16d1138dab82" containerName="kube-state-metrics" Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.892518 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.895188 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.895461 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.910615 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.981379 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c2d3be34-5565-4f76-8afe-50df0f2a558f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c2d3be34-5565-4f76-8afe-50df0f2a558f\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.981574 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2d3be34-5565-4f76-8afe-50df0f2a558f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c2d3be34-5565-4f76-8afe-50df0f2a558f\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.981620 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-67vk2\" (UniqueName: \"kubernetes.io/projected/c2d3be34-5565-4f76-8afe-50df0f2a558f-kube-api-access-67vk2\") pod \"kube-state-metrics-0\" (UID: \"c2d3be34-5565-4f76-8afe-50df0f2a558f\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:30 crc kubenswrapper[4776]: I1204 10:02:30.981645 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d3be34-5565-4f76-8afe-50df0f2a558f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c2d3be34-5565-4f76-8afe-50df0f2a558f\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.084467 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2d3be34-5565-4f76-8afe-50df0f2a558f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c2d3be34-5565-4f76-8afe-50df0f2a558f\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.084519 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67vk2\" (UniqueName: \"kubernetes.io/projected/c2d3be34-5565-4f76-8afe-50df0f2a558f-kube-api-access-67vk2\") pod \"kube-state-metrics-0\" (UID: \"c2d3be34-5565-4f76-8afe-50df0f2a558f\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.084539 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d3be34-5565-4f76-8afe-50df0f2a558f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c2d3be34-5565-4f76-8afe-50df0f2a558f\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.084631 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/c2d3be34-5565-4f76-8afe-50df0f2a558f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c2d3be34-5565-4f76-8afe-50df0f2a558f\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.093060 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c2d3be34-5565-4f76-8afe-50df0f2a558f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c2d3be34-5565-4f76-8afe-50df0f2a558f\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.096524 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2d3be34-5565-4f76-8afe-50df0f2a558f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c2d3be34-5565-4f76-8afe-50df0f2a558f\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.101648 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d3be34-5565-4f76-8afe-50df0f2a558f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c2d3be34-5565-4f76-8afe-50df0f2a558f\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.105246 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67vk2\" (UniqueName: \"kubernetes.io/projected/c2d3be34-5565-4f76-8afe-50df0f2a558f-kube-api-access-67vk2\") pod \"kube-state-metrics-0\" (UID: \"c2d3be34-5565-4f76-8afe-50df0f2a558f\") " pod="openstack/kube-state-metrics-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.222572 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.466061 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f57cb5d1-1baa-4fc7-8c71-16d1138dab82" path="/var/lib/kubelet/pods/f57cb5d1-1baa-4fc7-8c71-16d1138dab82/volumes" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.591128 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.701184 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e0ae0-1bd1-45f9-9baa-08d33108d638-combined-ca-bundle\") pod \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\" (UID: \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\") " Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.701271 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ttb6\" (UniqueName: \"kubernetes.io/projected/c11e0ae0-1bd1-45f9-9baa-08d33108d638-kube-api-access-2ttb6\") pod \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\" (UID: \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\") " Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.701394 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e0ae0-1bd1-45f9-9baa-08d33108d638-config-data\") pod \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\" (UID: \"c11e0ae0-1bd1-45f9-9baa-08d33108d638\") " Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.710167 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c11e0ae0-1bd1-45f9-9baa-08d33108d638-kube-api-access-2ttb6" (OuterVolumeSpecName: "kube-api-access-2ttb6") pod "c11e0ae0-1bd1-45f9-9baa-08d33108d638" (UID: "c11e0ae0-1bd1-45f9-9baa-08d33108d638"). InnerVolumeSpecName "kube-api-access-2ttb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.730245 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11e0ae0-1bd1-45f9-9baa-08d33108d638-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c11e0ae0-1bd1-45f9-9baa-08d33108d638" (UID: "c11e0ae0-1bd1-45f9-9baa-08d33108d638"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.733712 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11e0ae0-1bd1-45f9-9baa-08d33108d638-config-data" (OuterVolumeSpecName: "config-data") pod "c11e0ae0-1bd1-45f9-9baa-08d33108d638" (UID: "c11e0ae0-1bd1-45f9-9baa-08d33108d638"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.790034 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.803236 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e0ae0-1bd1-45f9-9baa-08d33108d638-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.803269 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ttb6\" (UniqueName: \"kubernetes.io/projected/c11e0ae0-1bd1-45f9-9baa-08d33108d638-kube-api-access-2ttb6\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.803279 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e0ae0-1bd1-45f9-9baa-08d33108d638-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.807202 4776 generic.go:334] "Generic 
(PLEG): container finished" podID="c11e0ae0-1bd1-45f9-9baa-08d33108d638" containerID="11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8" exitCode=0 Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.807262 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.807315 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c11e0ae0-1bd1-45f9-9baa-08d33108d638","Type":"ContainerDied","Data":"11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8"} Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.807370 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c11e0ae0-1bd1-45f9-9baa-08d33108d638","Type":"ContainerDied","Data":"b466ca9f68302149011f544d0848b180b6a443d925fadfcddae6c47e82f82e00"} Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.807397 4776 scope.go:117] "RemoveContainer" containerID="11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.811328 4776 generic.go:334] "Generic (PLEG): container finished" podID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerID="c79c4bde16c46d67255fd17da9687de91ca8d6b376f9f7084c9d1fdb22660815" exitCode=0 Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.811402 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ec0795-dc16-460c-8307-0b5864ff9a59","Type":"ContainerDied","Data":"c79c4bde16c46d67255fd17da9687de91ca8d6b376f9f7084c9d1fdb22660815"} Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.813217 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2d3be34-5565-4f76-8afe-50df0f2a558f","Type":"ContainerStarted","Data":"fdf5073ff55db03748d19a624ed4f229a5353fb425f41b999ccfc54f479dcef6"} Dec 04 10:02:31 crc 
kubenswrapper[4776]: I1204 10:02:31.830710 4776 scope.go:117] "RemoveContainer" containerID="11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8" Dec 04 10:02:31 crc kubenswrapper[4776]: E1204 10:02:31.831338 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8\": container with ID starting with 11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8 not found: ID does not exist" containerID="11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.831382 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8"} err="failed to get container status \"11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8\": rpc error: code = NotFound desc = could not find container \"11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8\": container with ID starting with 11e574454d284e7d455595e2dde1f3d3a5285762cd4a1efe0e724bf4675e71e8 not found: ID does not exist" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.843073 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.852428 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.863715 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:02:31 crc kubenswrapper[4776]: E1204 10:02:31.864144 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11e0ae0-1bd1-45f9-9baa-08d33108d638" containerName="nova-scheduler-scheduler" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.864166 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c11e0ae0-1bd1-45f9-9baa-08d33108d638" containerName="nova-scheduler-scheduler" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.864391 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11e0ae0-1bd1-45f9-9baa-08d33108d638" containerName="nova-scheduler-scheduler" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.865069 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.870716 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.876619 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.905559 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kncpk\" (UniqueName: \"kubernetes.io/projected/4de101ce-a0bf-4231-a572-a26b48e55f24-kube-api-access-kncpk\") pod \"nova-scheduler-0\" (UID: \"4de101ce-a0bf-4231-a572-a26b48e55f24\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.905624 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de101ce-a0bf-4231-a572-a26b48e55f24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4de101ce-a0bf-4231-a572-a26b48e55f24\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:31 crc kubenswrapper[4776]: I1204 10:02:31.905827 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de101ce-a0bf-4231-a572-a26b48e55f24-config-data\") pod \"nova-scheduler-0\" (UID: \"4de101ce-a0bf-4231-a572-a26b48e55f24\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:32 crc kubenswrapper[4776]: 
I1204 10:02:32.007299 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kncpk\" (UniqueName: \"kubernetes.io/projected/4de101ce-a0bf-4231-a572-a26b48e55f24-kube-api-access-kncpk\") pod \"nova-scheduler-0\" (UID: \"4de101ce-a0bf-4231-a572-a26b48e55f24\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.007357 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de101ce-a0bf-4231-a572-a26b48e55f24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4de101ce-a0bf-4231-a572-a26b48e55f24\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.007448 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de101ce-a0bf-4231-a572-a26b48e55f24-config-data\") pod \"nova-scheduler-0\" (UID: \"4de101ce-a0bf-4231-a572-a26b48e55f24\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.011207 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de101ce-a0bf-4231-a572-a26b48e55f24-config-data\") pod \"nova-scheduler-0\" (UID: \"4de101ce-a0bf-4231-a572-a26b48e55f24\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.012218 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de101ce-a0bf-4231-a572-a26b48e55f24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4de101ce-a0bf-4231-a572-a26b48e55f24\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.025983 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kncpk\" (UniqueName: 
\"kubernetes.io/projected/4de101ce-a0bf-4231-a572-a26b48e55f24-kube-api-access-kncpk\") pod \"nova-scheduler-0\" (UID: \"4de101ce-a0bf-4231-a572-a26b48e55f24\") " pod="openstack/nova-scheduler-0" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.188929 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.675365 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:02:32 crc kubenswrapper[4776]: W1204 10:02:32.682826 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4de101ce_a0bf_4231_a572_a26b48e55f24.slice/crio-e6659a70f65513d5e15f7187bc7c598dc85e2a14b2136dbfd3755b8b93efeb62 WatchSource:0}: Error finding container e6659a70f65513d5e15f7187bc7c598dc85e2a14b2136dbfd3755b8b93efeb62: Status 404 returned error can't find the container with id e6659a70f65513d5e15f7187bc7c598dc85e2a14b2136dbfd3755b8b93efeb62 Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.775253 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.821661 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2157cbd-6aa5-4b54-995e-c528d86c3f83-combined-ca-bundle\") pod \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.821755 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2157cbd-6aa5-4b54-995e-c528d86c3f83-logs\") pod \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.821798 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfw6t\" (UniqueName: \"kubernetes.io/projected/a2157cbd-6aa5-4b54-995e-c528d86c3f83-kube-api-access-tfw6t\") pod \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.821869 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2157cbd-6aa5-4b54-995e-c528d86c3f83-config-data\") pod \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\" (UID: \"a2157cbd-6aa5-4b54-995e-c528d86c3f83\") " Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.822595 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2157cbd-6aa5-4b54-995e-c528d86c3f83-logs" (OuterVolumeSpecName: "logs") pod "a2157cbd-6aa5-4b54-995e-c528d86c3f83" (UID: "a2157cbd-6aa5-4b54-995e-c528d86c3f83"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.827088 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2157cbd-6aa5-4b54-995e-c528d86c3f83-kube-api-access-tfw6t" (OuterVolumeSpecName: "kube-api-access-tfw6t") pod "a2157cbd-6aa5-4b54-995e-c528d86c3f83" (UID: "a2157cbd-6aa5-4b54-995e-c528d86c3f83"). InnerVolumeSpecName "kube-api-access-tfw6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.830213 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4de101ce-a0bf-4231-a572-a26b48e55f24","Type":"ContainerStarted","Data":"e6659a70f65513d5e15f7187bc7c598dc85e2a14b2136dbfd3755b8b93efeb62"} Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.834827 4776 generic.go:334] "Generic (PLEG): container finished" podID="a2157cbd-6aa5-4b54-995e-c528d86c3f83" containerID="3cfb8c846582a82da6f7d8caf0d703600433a3d8d90661bdabe91c58f38e972d" exitCode=0 Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.834941 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2157cbd-6aa5-4b54-995e-c528d86c3f83","Type":"ContainerDied","Data":"3cfb8c846582a82da6f7d8caf0d703600433a3d8d90661bdabe91c58f38e972d"} Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.834983 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2157cbd-6aa5-4b54-995e-c528d86c3f83","Type":"ContainerDied","Data":"fc823422d28dc34c6eb288b324ad449770d4e17dddc73fc5fde9ce76a826742a"} Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.834979 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.835006 4776 scope.go:117] "RemoveContainer" containerID="3cfb8c846582a82da6f7d8caf0d703600433a3d8d90661bdabe91c58f38e972d" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.843959 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c2d3be34-5565-4f76-8afe-50df0f2a558f","Type":"ContainerStarted","Data":"71e3f5763504be1110c2d437788e51199ab8aff9a34fe92f967319ff8ce9f064"} Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.844120 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.875692 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2157cbd-6aa5-4b54-995e-c528d86c3f83-config-data" (OuterVolumeSpecName: "config-data") pod "a2157cbd-6aa5-4b54-995e-c528d86c3f83" (UID: "a2157cbd-6aa5-4b54-995e-c528d86c3f83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.879202 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2157cbd-6aa5-4b54-995e-c528d86c3f83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2157cbd-6aa5-4b54-995e-c528d86c3f83" (UID: "a2157cbd-6aa5-4b54-995e-c528d86c3f83"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.880467 4776 scope.go:117] "RemoveContainer" containerID="76a9b7fc47b9de35c7efe796c7e9f5af3eeb8a339caccb1e540eed64d5740c27" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.886388 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.531643312 podStartE2EDuration="2.886341169s" podCreationTimestamp="2025-12-04 10:02:30 +0000 UTC" firstStartedPulling="2025-12-04 10:02:31.797436813 +0000 UTC m=+1396.663917190" lastFinishedPulling="2025-12-04 10:02:32.15213467 +0000 UTC m=+1397.018615047" observedRunningTime="2025-12-04 10:02:32.866452457 +0000 UTC m=+1397.732932864" watchObservedRunningTime="2025-12-04 10:02:32.886341169 +0000 UTC m=+1397.752821546" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.912993 4776 scope.go:117] "RemoveContainer" containerID="3cfb8c846582a82da6f7d8caf0d703600433a3d8d90661bdabe91c58f38e972d" Dec 04 10:02:32 crc kubenswrapper[4776]: E1204 10:02:32.913878 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cfb8c846582a82da6f7d8caf0d703600433a3d8d90661bdabe91c58f38e972d\": container with ID starting with 3cfb8c846582a82da6f7d8caf0d703600433a3d8d90661bdabe91c58f38e972d not found: ID does not exist" containerID="3cfb8c846582a82da6f7d8caf0d703600433a3d8d90661bdabe91c58f38e972d" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.913932 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cfb8c846582a82da6f7d8caf0d703600433a3d8d90661bdabe91c58f38e972d"} err="failed to get container status \"3cfb8c846582a82da6f7d8caf0d703600433a3d8d90661bdabe91c58f38e972d\": rpc error: code = NotFound desc = could not find container \"3cfb8c846582a82da6f7d8caf0d703600433a3d8d90661bdabe91c58f38e972d\": container with ID starting with 
3cfb8c846582a82da6f7d8caf0d703600433a3d8d90661bdabe91c58f38e972d not found: ID does not exist" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.913971 4776 scope.go:117] "RemoveContainer" containerID="76a9b7fc47b9de35c7efe796c7e9f5af3eeb8a339caccb1e540eed64d5740c27" Dec 04 10:02:32 crc kubenswrapper[4776]: E1204 10:02:32.914599 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76a9b7fc47b9de35c7efe796c7e9f5af3eeb8a339caccb1e540eed64d5740c27\": container with ID starting with 76a9b7fc47b9de35c7efe796c7e9f5af3eeb8a339caccb1e540eed64d5740c27 not found: ID does not exist" containerID="76a9b7fc47b9de35c7efe796c7e9f5af3eeb8a339caccb1e540eed64d5740c27" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.914638 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76a9b7fc47b9de35c7efe796c7e9f5af3eeb8a339caccb1e540eed64d5740c27"} err="failed to get container status \"76a9b7fc47b9de35c7efe796c7e9f5af3eeb8a339caccb1e540eed64d5740c27\": rpc error: code = NotFound desc = could not find container \"76a9b7fc47b9de35c7efe796c7e9f5af3eeb8a339caccb1e540eed64d5740c27\": container with ID starting with 76a9b7fc47b9de35c7efe796c7e9f5af3eeb8a339caccb1e540eed64d5740c27 not found: ID does not exist" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.924460 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2157cbd-6aa5-4b54-995e-c528d86c3f83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.924498 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2157cbd-6aa5-4b54-995e-c528d86c3f83-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.924509 4776 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-tfw6t\" (UniqueName: \"kubernetes.io/projected/a2157cbd-6aa5-4b54-995e-c528d86c3f83-kube-api-access-tfw6t\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:32 crc kubenswrapper[4776]: I1204 10:02:32.924519 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2157cbd-6aa5-4b54-995e-c528d86c3f83-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.179892 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.194969 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.213455 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 10:02:33 crc kubenswrapper[4776]: E1204 10:02:33.213843 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2157cbd-6aa5-4b54-995e-c528d86c3f83" containerName="nova-api-log" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.213862 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2157cbd-6aa5-4b54-995e-c528d86c3f83" containerName="nova-api-log" Dec 04 10:02:33 crc kubenswrapper[4776]: E1204 10:02:33.213892 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2157cbd-6aa5-4b54-995e-c528d86c3f83" containerName="nova-api-api" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.213899 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2157cbd-6aa5-4b54-995e-c528d86c3f83" containerName="nova-api-api" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.214078 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2157cbd-6aa5-4b54-995e-c528d86c3f83" containerName="nova-api-log" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.214099 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a2157cbd-6aa5-4b54-995e-c528d86c3f83" containerName="nova-api-api" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.215059 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.217515 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.224620 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.232755 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29811233-5318-47b6-8145-9057d38506f8-logs\") pod \"nova-api-0\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.232819 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29811233-5318-47b6-8145-9057d38506f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.232910 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-265fx\" (UniqueName: \"kubernetes.io/projected/29811233-5318-47b6-8145-9057d38506f8-kube-api-access-265fx\") pod \"nova-api-0\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.232983 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29811233-5318-47b6-8145-9057d38506f8-config-data\") pod \"nova-api-0\" (UID: 
\"29811233-5318-47b6-8145-9057d38506f8\") " pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.335234 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29811233-5318-47b6-8145-9057d38506f8-logs\") pod \"nova-api-0\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.335374 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29811233-5318-47b6-8145-9057d38506f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.335535 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-265fx\" (UniqueName: \"kubernetes.io/projected/29811233-5318-47b6-8145-9057d38506f8-kube-api-access-265fx\") pod \"nova-api-0\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.335709 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29811233-5318-47b6-8145-9057d38506f8-logs\") pod \"nova-api-0\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.335726 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29811233-5318-47b6-8145-9057d38506f8-config-data\") pod \"nova-api-0\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.339986 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/29811233-5318-47b6-8145-9057d38506f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.340275 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29811233-5318-47b6-8145-9057d38506f8-config-data\") pod \"nova-api-0\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.358907 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-265fx\" (UniqueName: \"kubernetes.io/projected/29811233-5318-47b6-8145-9057d38506f8-kube-api-access-265fx\") pod \"nova-api-0\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.416558 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.416683 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.464730 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2157cbd-6aa5-4b54-995e-c528d86c3f83" path="/var/lib/kubelet/pods/a2157cbd-6aa5-4b54-995e-c528d86c3f83/volumes" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.465361 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c11e0ae0-1bd1-45f9-9baa-08d33108d638" path="/var/lib/kubelet/pods/c11e0ae0-1bd1-45f9-9baa-08d33108d638/volumes" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.536758 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.860249 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4de101ce-a0bf-4231-a572-a26b48e55f24","Type":"ContainerStarted","Data":"df0e6cda7c5cf8619f1a087d6feff2e8abebb06036121a5e6aeb4caddbfdf484"} Dec 04 10:02:33 crc kubenswrapper[4776]: I1204 10:02:33.992563 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.992545738 podStartE2EDuration="2.992545738s" podCreationTimestamp="2025-12-04 10:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:02:33.880462998 +0000 UTC m=+1398.746943395" watchObservedRunningTime="2025-12-04 10:02:33.992545738 +0000 UTC m=+1398.859026115" Dec 04 10:02:34 crc kubenswrapper[4776]: I1204 10:02:33.999733 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:02:34 crc kubenswrapper[4776]: W1204 10:02:34.002620 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29811233_5318_47b6_8145_9057d38506f8.slice/crio-f5cf26ccc4b958b86cbac13df9189a9446f05f7e1da12485504328915c04ad34 WatchSource:0}: Error finding container f5cf26ccc4b958b86cbac13df9189a9446f05f7e1da12485504328915c04ad34: Status 404 returned error can't find the container with id f5cf26ccc4b958b86cbac13df9189a9446f05f7e1da12485504328915c04ad34 Dec 04 10:02:34 crc kubenswrapper[4776]: I1204 10:02:34.895401 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29811233-5318-47b6-8145-9057d38506f8","Type":"ContainerStarted","Data":"6df2de359d6a46531bd9714c729e48cb1e38b183aa45ee6f3cad523d91ea30e6"} Dec 04 10:02:34 crc kubenswrapper[4776]: I1204 10:02:34.896046 4776 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"29811233-5318-47b6-8145-9057d38506f8","Type":"ContainerStarted","Data":"beb8ceba514f77fabf59331cb39a366295b226799d26e871b05774b2bbaf3b33"} Dec 04 10:02:34 crc kubenswrapper[4776]: I1204 10:02:34.896063 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29811233-5318-47b6-8145-9057d38506f8","Type":"ContainerStarted","Data":"f5cf26ccc4b958b86cbac13df9189a9446f05f7e1da12485504328915c04ad34"} Dec 04 10:02:34 crc kubenswrapper[4776]: I1204 10:02:34.927093 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.9270647300000001 podStartE2EDuration="1.92706473s" podCreationTimestamp="2025-12-04 10:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:02:34.926170442 +0000 UTC m=+1399.792650819" watchObservedRunningTime="2025-12-04 10:02:34.92706473 +0000 UTC m=+1399.793545097" Dec 04 10:02:34 crc kubenswrapper[4776]: I1204 10:02:34.928282 4776 generic.go:334] "Generic (PLEG): container finished" podID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerID="39cf2f4a517dac058be8aa8acaf7c5c3bd75315b8dda4b3ff854eb810933bf36" exitCode=0 Dec 04 10:02:34 crc kubenswrapper[4776]: I1204 10:02:34.929006 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ec0795-dc16-460c-8307-0b5864ff9a59","Type":"ContainerDied","Data":"39cf2f4a517dac058be8aa8acaf7c5c3bd75315b8dda4b3ff854eb810933bf36"} Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.092556 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.193050 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-sg-core-conf-yaml\") pod \"65ec0795-dc16-460c-8307-0b5864ff9a59\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.193094 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ec0795-dc16-460c-8307-0b5864ff9a59-run-httpd\") pod \"65ec0795-dc16-460c-8307-0b5864ff9a59\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.193145 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-combined-ca-bundle\") pod \"65ec0795-dc16-460c-8307-0b5864ff9a59\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.193212 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-scripts\") pod \"65ec0795-dc16-460c-8307-0b5864ff9a59\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.193288 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-config-data\") pod \"65ec0795-dc16-460c-8307-0b5864ff9a59\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.193354 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/65ec0795-dc16-460c-8307-0b5864ff9a59-log-httpd\") pod \"65ec0795-dc16-460c-8307-0b5864ff9a59\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.193374 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxjx8\" (UniqueName: \"kubernetes.io/projected/65ec0795-dc16-460c-8307-0b5864ff9a59-kube-api-access-qxjx8\") pod \"65ec0795-dc16-460c-8307-0b5864ff9a59\" (UID: \"65ec0795-dc16-460c-8307-0b5864ff9a59\") " Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.194097 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ec0795-dc16-460c-8307-0b5864ff9a59-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "65ec0795-dc16-460c-8307-0b5864ff9a59" (UID: "65ec0795-dc16-460c-8307-0b5864ff9a59"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.194610 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ec0795-dc16-460c-8307-0b5864ff9a59-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "65ec0795-dc16-460c-8307-0b5864ff9a59" (UID: "65ec0795-dc16-460c-8307-0b5864ff9a59"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.216362 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ec0795-dc16-460c-8307-0b5864ff9a59-kube-api-access-qxjx8" (OuterVolumeSpecName: "kube-api-access-qxjx8") pod "65ec0795-dc16-460c-8307-0b5864ff9a59" (UID: "65ec0795-dc16-460c-8307-0b5864ff9a59"). InnerVolumeSpecName "kube-api-access-qxjx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.217118 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-scripts" (OuterVolumeSpecName: "scripts") pod "65ec0795-dc16-460c-8307-0b5864ff9a59" (UID: "65ec0795-dc16-460c-8307-0b5864ff9a59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.281497 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "65ec0795-dc16-460c-8307-0b5864ff9a59" (UID: "65ec0795-dc16-460c-8307-0b5864ff9a59"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.295674 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ec0795-dc16-460c-8307-0b5864ff9a59-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.295768 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxjx8\" (UniqueName: \"kubernetes.io/projected/65ec0795-dc16-460c-8307-0b5864ff9a59-kube-api-access-qxjx8\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.295792 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.295803 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ec0795-dc16-460c-8307-0b5864ff9a59-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 
10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.295817 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.315164 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65ec0795-dc16-460c-8307-0b5864ff9a59" (UID: "65ec0795-dc16-460c-8307-0b5864ff9a59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.337536 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-config-data" (OuterVolumeSpecName: "config-data") pod "65ec0795-dc16-460c-8307-0b5864ff9a59" (UID: "65ec0795-dc16-460c-8307-0b5864ff9a59"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.404209 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.404261 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ec0795-dc16-460c-8307-0b5864ff9a59-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.941599 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ec0795-dc16-460c-8307-0b5864ff9a59","Type":"ContainerDied","Data":"623d31482192ab0496706da2897e895c8a5e98865b92f255b9fb85a18e7b6699"} Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.941630 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.941893 4776 scope.go:117] "RemoveContainer" containerID="b669ea7a1befe4f0f3848a9eafa4fd1f6de1991f2555d517d3510c0d0e3e6bce" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.974906 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.977239 4776 scope.go:117] "RemoveContainer" containerID="25700ead5b3aab6a01af98c4dbe8918c0a2a8497cae47ac55f04022a90a2651a" Dec 04 10:02:35 crc kubenswrapper[4776]: I1204 10:02:35.989163 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.001828 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:02:36 crc kubenswrapper[4776]: E1204 10:02:36.002572 4776 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="ceilometer-notification-agent" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.002605 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="ceilometer-notification-agent" Dec 04 10:02:36 crc kubenswrapper[4776]: E1204 10:02:36.002652 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="ceilometer-central-agent" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.002662 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="ceilometer-central-agent" Dec 04 10:02:36 crc kubenswrapper[4776]: E1204 10:02:36.002681 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="sg-core" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.002689 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="sg-core" Dec 04 10:02:36 crc kubenswrapper[4776]: E1204 10:02:36.002712 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="proxy-httpd" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.002722 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="proxy-httpd" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.002987 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="proxy-httpd" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.003062 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="ceilometer-notification-agent" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.003078 4776 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="ceilometer-central-agent" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.003089 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" containerName="sg-core" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.005801 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.008840 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.008932 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.009103 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.016023 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.021195 4776 scope.go:117] "RemoveContainer" containerID="39cf2f4a517dac058be8aa8acaf7c5c3bd75315b8dda4b3ff854eb810933bf36" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.057130 4776 scope.go:117] "RemoveContainer" containerID="c79c4bde16c46d67255fd17da9687de91ca8d6b376f9f7084c9d1fdb22660815" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.122250 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-config-data\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.122314 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.122491 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192810e5-d32d-46e6-9296-9060df1a6b5c-log-httpd\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.122873 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.123106 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-scripts\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.123423 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192810e5-d32d-46e6-9296-9060df1a6b5c-run-httpd\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.123559 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.123626 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87x77\" (UniqueName: \"kubernetes.io/projected/192810e5-d32d-46e6-9296-9060df1a6b5c-kube-api-access-87x77\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.226639 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192810e5-d32d-46e6-9296-9060df1a6b5c-run-httpd\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.227230 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.227339 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87x77\" (UniqueName: \"kubernetes.io/projected/192810e5-d32d-46e6-9296-9060df1a6b5c-kube-api-access-87x77\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.227459 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-config-data\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" 
Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.227528 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.227642 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192810e5-d32d-46e6-9296-9060df1a6b5c-log-httpd\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.227737 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.227811 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-scripts\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.227233 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192810e5-d32d-46e6-9296-9060df1a6b5c-run-httpd\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.229429 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192810e5-d32d-46e6-9296-9060df1a6b5c-log-httpd\") pod 
\"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.232088 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-scripts\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.232790 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.232953 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.232097 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.235694 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-config-data\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.256880 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87x77\" 
(UniqueName: \"kubernetes.io/projected/192810e5-d32d-46e6-9296-9060df1a6b5c-kube-api-access-87x77\") pod \"ceilometer-0\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.336599 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.793631 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:02:36 crc kubenswrapper[4776]: W1204 10:02:36.802748 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod192810e5_d32d_46e6_9296_9060df1a6b5c.slice/crio-424efd1773a085a4d2f219f4dbb5fed107241dc2828c7b2ec31c9763544a395a WatchSource:0}: Error finding container 424efd1773a085a4d2f219f4dbb5fed107241dc2828c7b2ec31c9763544a395a: Status 404 returned error can't find the container with id 424efd1773a085a4d2f219f4dbb5fed107241dc2828c7b2ec31c9763544a395a Dec 04 10:02:36 crc kubenswrapper[4776]: I1204 10:02:36.953204 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192810e5-d32d-46e6-9296-9060df1a6b5c","Type":"ContainerStarted","Data":"424efd1773a085a4d2f219f4dbb5fed107241dc2828c7b2ec31c9763544a395a"} Dec 04 10:02:37 crc kubenswrapper[4776]: I1204 10:02:37.189504 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 10:02:37 crc kubenswrapper[4776]: I1204 10:02:37.467597 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65ec0795-dc16-460c-8307-0b5864ff9a59" path="/var/lib/kubelet/pods/65ec0795-dc16-460c-8307-0b5864ff9a59/volumes" Dec 04 10:02:37 crc kubenswrapper[4776]: I1204 10:02:37.967996 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"192810e5-d32d-46e6-9296-9060df1a6b5c","Type":"ContainerStarted","Data":"750d80ede973f7baf821f529046d5199274ecf5ed45ae16d0f9769398694187e"} Dec 04 10:02:38 crc kubenswrapper[4776]: I1204 10:02:38.241814 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 04 10:02:38 crc kubenswrapper[4776]: I1204 10:02:38.416729 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 10:02:38 crc kubenswrapper[4776]: I1204 10:02:38.416816 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 10:02:38 crc kubenswrapper[4776]: I1204 10:02:38.981663 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192810e5-d32d-46e6-9296-9060df1a6b5c","Type":"ContainerStarted","Data":"8438854833ed832965bd81e2270334dc501c482ad4e734aba419d12f2745b800"} Dec 04 10:02:38 crc kubenswrapper[4776]: I1204 10:02:38.981709 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192810e5-d32d-46e6-9296-9060df1a6b5c","Type":"ContainerStarted","Data":"d94138b8a0c74249c98290694c73f1dce1346e577b6fdf736b13d2eaccaa148b"} Dec 04 10:02:39 crc kubenswrapper[4776]: I1204 10:02:39.430201 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:02:39 crc kubenswrapper[4776]: I1204 10:02:39.430267 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Dec 04 10:02:41 crc kubenswrapper[4776]: I1204 10:02:41.009737 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192810e5-d32d-46e6-9296-9060df1a6b5c","Type":"ContainerStarted","Data":"be1ae40139ed9829dec88192d8855694041c6f0c8d8d6c161c1e426e08a76611"} Dec 04 10:02:41 crc kubenswrapper[4776]: I1204 10:02:41.011281 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:02:41 crc kubenswrapper[4776]: I1204 10:02:41.044037 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.121550058 podStartE2EDuration="6.044014327s" podCreationTimestamp="2025-12-04 10:02:35 +0000 UTC" firstStartedPulling="2025-12-04 10:02:36.805978724 +0000 UTC m=+1401.672459101" lastFinishedPulling="2025-12-04 10:02:39.728442993 +0000 UTC m=+1404.594923370" observedRunningTime="2025-12-04 10:02:41.040482168 +0000 UTC m=+1405.906962535" watchObservedRunningTime="2025-12-04 10:02:41.044014327 +0000 UTC m=+1405.910494704" Dec 04 10:02:41 crc kubenswrapper[4776]: I1204 10:02:41.231398 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 10:02:43 crc kubenswrapper[4776]: I1204 10:02:43.411247 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 10:02:43 crc kubenswrapper[4776]: I1204 10:02:43.512894 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 10:02:43 crc kubenswrapper[4776]: I1204 10:02:43.542607 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:02:43 crc kubenswrapper[4776]: I1204 10:02:43.544375 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:02:44 crc kubenswrapper[4776]: I1204 10:02:44.067830 
4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 10:02:44 crc kubenswrapper[4776]: I1204 10:02:44.624097 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="29811233-5318-47b6-8145-9057d38506f8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:02:44 crc kubenswrapper[4776]: I1204 10:02:44.624147 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="29811233-5318-47b6-8145-9057d38506f8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:02:48 crc kubenswrapper[4776]: I1204 10:02:48.421856 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 10:02:48 crc kubenswrapper[4776]: I1204 10:02:48.422346 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 10:02:48 crc kubenswrapper[4776]: I1204 10:02:48.427036 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 10:02:48 crc kubenswrapper[4776]: I1204 10:02:48.428965 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 10:02:50 crc kubenswrapper[4776]: I1204 10:02:50.908082 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.105122 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4478\" (UniqueName: \"kubernetes.io/projected/f995b914-c112-421f-b350-c8e3c4e029f1-kube-api-access-q4478\") pod \"f995b914-c112-421f-b350-c8e3c4e029f1\" (UID: \"f995b914-c112-421f-b350-c8e3c4e029f1\") " Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.105202 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f995b914-c112-421f-b350-c8e3c4e029f1-combined-ca-bundle\") pod \"f995b914-c112-421f-b350-c8e3c4e029f1\" (UID: \"f995b914-c112-421f-b350-c8e3c4e029f1\") " Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.105269 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f995b914-c112-421f-b350-c8e3c4e029f1-config-data\") pod \"f995b914-c112-421f-b350-c8e3c4e029f1\" (UID: \"f995b914-c112-421f-b350-c8e3c4e029f1\") " Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.105971 4776 generic.go:334] "Generic (PLEG): container finished" podID="f995b914-c112-421f-b350-c8e3c4e029f1" containerID="f97f78b7be838374e2ae44dfdfeea7b00b8d203206b367846c79ac28e638bd4d" exitCode=137 Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.106026 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f995b914-c112-421f-b350-c8e3c4e029f1","Type":"ContainerDied","Data":"f97f78b7be838374e2ae44dfdfeea7b00b8d203206b367846c79ac28e638bd4d"} Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.106058 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"f995b914-c112-421f-b350-c8e3c4e029f1","Type":"ContainerDied","Data":"9496de1d350228349f7148f823784d510674ceab2d56c142630757e3819aa546"} Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.106079 4776 scope.go:117] "RemoveContainer" containerID="f97f78b7be838374e2ae44dfdfeea7b00b8d203206b367846c79ac28e638bd4d" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.106228 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.112124 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f995b914-c112-421f-b350-c8e3c4e029f1-kube-api-access-q4478" (OuterVolumeSpecName: "kube-api-access-q4478") pod "f995b914-c112-421f-b350-c8e3c4e029f1" (UID: "f995b914-c112-421f-b350-c8e3c4e029f1"). InnerVolumeSpecName "kube-api-access-q4478". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.137262 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f995b914-c112-421f-b350-c8e3c4e029f1-config-data" (OuterVolumeSpecName: "config-data") pod "f995b914-c112-421f-b350-c8e3c4e029f1" (UID: "f995b914-c112-421f-b350-c8e3c4e029f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.138189 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f995b914-c112-421f-b350-c8e3c4e029f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f995b914-c112-421f-b350-c8e3c4e029f1" (UID: "f995b914-c112-421f-b350-c8e3c4e029f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.208538 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4478\" (UniqueName: \"kubernetes.io/projected/f995b914-c112-421f-b350-c8e3c4e029f1-kube-api-access-q4478\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.208579 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f995b914-c112-421f-b350-c8e3c4e029f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.208591 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f995b914-c112-421f-b350-c8e3c4e029f1-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.217723 4776 scope.go:117] "RemoveContainer" containerID="f97f78b7be838374e2ae44dfdfeea7b00b8d203206b367846c79ac28e638bd4d" Dec 04 10:02:51 crc kubenswrapper[4776]: E1204 10:02:51.218303 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f97f78b7be838374e2ae44dfdfeea7b00b8d203206b367846c79ac28e638bd4d\": container with ID starting with f97f78b7be838374e2ae44dfdfeea7b00b8d203206b367846c79ac28e638bd4d not found: ID does not exist" containerID="f97f78b7be838374e2ae44dfdfeea7b00b8d203206b367846c79ac28e638bd4d" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.218350 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f97f78b7be838374e2ae44dfdfeea7b00b8d203206b367846c79ac28e638bd4d"} err="failed to get container status \"f97f78b7be838374e2ae44dfdfeea7b00b8d203206b367846c79ac28e638bd4d\": rpc error: code = NotFound desc = could not find container \"f97f78b7be838374e2ae44dfdfeea7b00b8d203206b367846c79ac28e638bd4d\": container with ID 
starting with f97f78b7be838374e2ae44dfdfeea7b00b8d203206b367846c79ac28e638bd4d not found: ID does not exist" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.447530 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.465309 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.487267 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:02:51 crc kubenswrapper[4776]: E1204 10:02:51.487693 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f995b914-c112-421f-b350-c8e3c4e029f1" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.487712 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f995b914-c112-421f-b350-c8e3c4e029f1" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.487896 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f995b914-c112-421f-b350-c8e3c4e029f1" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.488647 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.491234 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.491988 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.492413 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.499027 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.615394 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea0d417e-f205-4aa7-bc96-ba6879069b4a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.615602 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea0d417e-f205-4aa7-bc96-ba6879069b4a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.616047 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxjkp\" (UniqueName: \"kubernetes.io/projected/ea0d417e-f205-4aa7-bc96-ba6879069b4a-kube-api-access-vxjkp\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc 
kubenswrapper[4776]: I1204 10:02:51.616149 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0d417e-f205-4aa7-bc96-ba6879069b4a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.616587 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea0d417e-f205-4aa7-bc96-ba6879069b4a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.719036 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea0d417e-f205-4aa7-bc96-ba6879069b4a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.719485 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxjkp\" (UniqueName: \"kubernetes.io/projected/ea0d417e-f205-4aa7-bc96-ba6879069b4a-kube-api-access-vxjkp\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.719589 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0d417e-f205-4aa7-bc96-ba6879069b4a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 
10:02:51.719675 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea0d417e-f205-4aa7-bc96-ba6879069b4a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.719794 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea0d417e-f205-4aa7-bc96-ba6879069b4a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.723580 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0d417e-f205-4aa7-bc96-ba6879069b4a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.724030 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea0d417e-f205-4aa7-bc96-ba6879069b4a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.724089 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea0d417e-f205-4aa7-bc96-ba6879069b4a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.731455 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea0d417e-f205-4aa7-bc96-ba6879069b4a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.738160 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxjkp\" (UniqueName: \"kubernetes.io/projected/ea0d417e-f205-4aa7-bc96-ba6879069b4a-kube-api-access-vxjkp\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea0d417e-f205-4aa7-bc96-ba6879069b4a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:51 crc kubenswrapper[4776]: I1204 10:02:51.819355 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:52 crc kubenswrapper[4776]: I1204 10:02:52.273202 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 10:02:52 crc kubenswrapper[4776]: W1204 10:02:52.288044 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea0d417e_f205_4aa7_bc96_ba6879069b4a.slice/crio-c1be8f616e43febeaf8dc0915a86867d5553171d25da38ca9bf36ad7fe7b15ca WatchSource:0}: Error finding container c1be8f616e43febeaf8dc0915a86867d5553171d25da38ca9bf36ad7fe7b15ca: Status 404 returned error can't find the container with id c1be8f616e43febeaf8dc0915a86867d5553171d25da38ca9bf36ad7fe7b15ca Dec 04 10:02:53 crc kubenswrapper[4776]: I1204 10:02:53.137483 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ea0d417e-f205-4aa7-bc96-ba6879069b4a","Type":"ContainerStarted","Data":"cbd20b70ce3339d3c8307f5802c049d94de7c14430307d0ab7eb5f4d369ca12e"} Dec 04 10:02:53 crc kubenswrapper[4776]: I1204 10:02:53.137799 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"ea0d417e-f205-4aa7-bc96-ba6879069b4a","Type":"ContainerStarted","Data":"c1be8f616e43febeaf8dc0915a86867d5553171d25da38ca9bf36ad7fe7b15ca"} Dec 04 10:02:53 crc kubenswrapper[4776]: I1204 10:02:53.161374 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.161352572 podStartE2EDuration="2.161352572s" podCreationTimestamp="2025-12-04 10:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:02:53.157240624 +0000 UTC m=+1418.023721011" watchObservedRunningTime="2025-12-04 10:02:53.161352572 +0000 UTC m=+1418.027832949" Dec 04 10:02:53 crc kubenswrapper[4776]: I1204 10:02:53.469378 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f995b914-c112-421f-b350-c8e3c4e029f1" path="/var/lib/kubelet/pods/f995b914-c112-421f-b350-c8e3c4e029f1/volumes" Dec 04 10:02:53 crc kubenswrapper[4776]: I1204 10:02:53.541820 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 10:02:53 crc kubenswrapper[4776]: I1204 10:02:53.542377 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 10:02:53 crc kubenswrapper[4776]: I1204 10:02:53.542559 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 10:02:53 crc kubenswrapper[4776]: I1204 10:02:53.547467 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.145968 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.149722 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 
10:02:54.343774 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-8n47z"] Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.345791 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.375017 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-8n47z"] Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.473692 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.473869 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7qt9\" (UniqueName: \"kubernetes.io/projected/1a99db84-a067-4f3b-ae24-7f59633187d1-kube-api-access-r7qt9\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.473983 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.474017 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-dns-svc\") pod \"dnsmasq-dns-5b856c5697-8n47z\" 
(UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.474088 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-config\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.576227 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.576350 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7qt9\" (UniqueName: \"kubernetes.io/projected/1a99db84-a067-4f3b-ae24-7f59633187d1-kube-api-access-r7qt9\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.576426 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.576445 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-dns-svc\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " 
pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.576468 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-config\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.577392 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.578177 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.578611 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-config\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.578699 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-dns-svc\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.602806 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7qt9\" (UniqueName: \"kubernetes.io/projected/1a99db84-a067-4f3b-ae24-7f59633187d1-kube-api-access-r7qt9\") pod \"dnsmasq-dns-5b856c5697-8n47z\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:54 crc kubenswrapper[4776]: I1204 10:02:54.670862 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:55 crc kubenswrapper[4776]: I1204 10:02:55.164248 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-8n47z"] Dec 04 10:02:55 crc kubenswrapper[4776]: W1204 10:02:55.168632 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a99db84_a067_4f3b_ae24_7f59633187d1.slice/crio-84f6368f1557d33c1748e96e4ac02e02129b80e0b43dd53636433edc60a3e9f8 WatchSource:0}: Error finding container 84f6368f1557d33c1748e96e4ac02e02129b80e0b43dd53636433edc60a3e9f8: Status 404 returned error can't find the container with id 84f6368f1557d33c1748e96e4ac02e02129b80e0b43dd53636433edc60a3e9f8 Dec 04 10:02:56 crc kubenswrapper[4776]: I1204 10:02:56.166635 4776 generic.go:334] "Generic (PLEG): container finished" podID="1a99db84-a067-4f3b-ae24-7f59633187d1" containerID="f4e8b2c83a32e764c566f3bad35e9c10317ad406edb0eede1de8f67909d877a8" exitCode=0 Dec 04 10:02:56 crc kubenswrapper[4776]: I1204 10:02:56.166705 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-8n47z" event={"ID":"1a99db84-a067-4f3b-ae24-7f59633187d1","Type":"ContainerDied","Data":"f4e8b2c83a32e764c566f3bad35e9c10317ad406edb0eede1de8f67909d877a8"} Dec 04 10:02:56 crc kubenswrapper[4776]: I1204 10:02:56.167087 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-8n47z" 
event={"ID":"1a99db84-a067-4f3b-ae24-7f59633187d1","Type":"ContainerStarted","Data":"84f6368f1557d33c1748e96e4ac02e02129b80e0b43dd53636433edc60a3e9f8"} Dec 04 10:02:56 crc kubenswrapper[4776]: I1204 10:02:56.799861 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:02:56 crc kubenswrapper[4776]: I1204 10:02:56.820181 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:02:56 crc kubenswrapper[4776]: I1204 10:02:56.908995 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:02:56 crc kubenswrapper[4776]: I1204 10:02:56.909549 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="ceilometer-central-agent" containerID="cri-o://750d80ede973f7baf821f529046d5199274ecf5ed45ae16d0f9769398694187e" gracePeriod=30 Dec 04 10:02:56 crc kubenswrapper[4776]: I1204 10:02:56.909617 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="proxy-httpd" containerID="cri-o://be1ae40139ed9829dec88192d8855694041c6f0c8d8d6c161c1e426e08a76611" gracePeriod=30 Dec 04 10:02:56 crc kubenswrapper[4776]: I1204 10:02:56.909653 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="sg-core" containerID="cri-o://8438854833ed832965bd81e2270334dc501c482ad4e734aba419d12f2745b800" gracePeriod=30 Dec 04 10:02:56 crc kubenswrapper[4776]: I1204 10:02:56.909695 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="ceilometer-notification-agent" 
containerID="cri-o://d94138b8a0c74249c98290694c73f1dce1346e577b6fdf736b13d2eaccaa148b" gracePeriod=30 Dec 04 10:02:56 crc kubenswrapper[4776]: I1204 10:02:56.939783 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 04 10:02:57 crc kubenswrapper[4776]: I1204 10:02:57.188210 4776 generic.go:334] "Generic (PLEG): container finished" podID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerID="be1ae40139ed9829dec88192d8855694041c6f0c8d8d6c161c1e426e08a76611" exitCode=0 Dec 04 10:02:57 crc kubenswrapper[4776]: I1204 10:02:57.188552 4776 generic.go:334] "Generic (PLEG): container finished" podID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerID="8438854833ed832965bd81e2270334dc501c482ad4e734aba419d12f2745b800" exitCode=2 Dec 04 10:02:57 crc kubenswrapper[4776]: I1204 10:02:57.188286 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192810e5-d32d-46e6-9296-9060df1a6b5c","Type":"ContainerDied","Data":"be1ae40139ed9829dec88192d8855694041c6f0c8d8d6c161c1e426e08a76611"} Dec 04 10:02:57 crc kubenswrapper[4776]: I1204 10:02:57.188618 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192810e5-d32d-46e6-9296-9060df1a6b5c","Type":"ContainerDied","Data":"8438854833ed832965bd81e2270334dc501c482ad4e734aba419d12f2745b800"} Dec 04 10:02:57 crc kubenswrapper[4776]: I1204 10:02:57.192746 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-8n47z" event={"ID":"1a99db84-a067-4f3b-ae24-7f59633187d1","Type":"ContainerStarted","Data":"9c64e5e0c59887dc510d9dbe5d08cd9211d4d597f7ee07a7985ad35268186447"} Dec 04 10:02:57 crc kubenswrapper[4776]: I1204 10:02:57.193133 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="29811233-5318-47b6-8145-9057d38506f8" containerName="nova-api-log" containerID="cri-o://beb8ceba514f77fabf59331cb39a366295b226799d26e871b05774b2bbaf3b33" gracePeriod=30 Dec 04 10:02:57 crc kubenswrapper[4776]: I1204 10:02:57.193530 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="29811233-5318-47b6-8145-9057d38506f8" containerName="nova-api-api" containerID="cri-o://6df2de359d6a46531bd9714c729e48cb1e38b183aa45ee6f3cad523d91ea30e6" gracePeriod=30 Dec 04 10:02:57 crc kubenswrapper[4776]: I1204 10:02:57.193964 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:02:57 crc kubenswrapper[4776]: I1204 10:02:57.225625 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-8n47z" podStartSLOduration=3.225598804 podStartE2EDuration="3.225598804s" podCreationTimestamp="2025-12-04 10:02:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:02:57.224472939 +0000 UTC m=+1422.090953336" watchObservedRunningTime="2025-12-04 10:02:57.225598804 +0000 UTC m=+1422.092079181" Dec 04 10:02:58 crc kubenswrapper[4776]: I1204 10:02:58.211221 4776 generic.go:334] "Generic (PLEG): container finished" podID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerID="750d80ede973f7baf821f529046d5199274ecf5ed45ae16d0f9769398694187e" exitCode=0 Dec 04 10:02:58 crc kubenswrapper[4776]: I1204 10:02:58.213033 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192810e5-d32d-46e6-9296-9060df1a6b5c","Type":"ContainerDied","Data":"750d80ede973f7baf821f529046d5199274ecf5ed45ae16d0f9769398694187e"} Dec 04 10:02:58 crc kubenswrapper[4776]: I1204 10:02:58.220296 4776 generic.go:334] "Generic (PLEG): container finished" podID="29811233-5318-47b6-8145-9057d38506f8" 
containerID="beb8ceba514f77fabf59331cb39a366295b226799d26e871b05774b2bbaf3b33" exitCode=143 Dec 04 10:02:58 crc kubenswrapper[4776]: I1204 10:02:58.222067 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29811233-5318-47b6-8145-9057d38506f8","Type":"ContainerDied","Data":"beb8ceba514f77fabf59331cb39a366295b226799d26e871b05774b2bbaf3b33"} Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.243665 4776 generic.go:334] "Generic (PLEG): container finished" podID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerID="d94138b8a0c74249c98290694c73f1dce1346e577b6fdf736b13d2eaccaa148b" exitCode=0 Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.243735 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192810e5-d32d-46e6-9296-9060df1a6b5c","Type":"ContainerDied","Data":"d94138b8a0c74249c98290694c73f1dce1346e577b6fdf736b13d2eaccaa148b"} Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.537287 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.630571 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-ceilometer-tls-certs\") pod \"192810e5-d32d-46e6-9296-9060df1a6b5c\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.630632 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-combined-ca-bundle\") pod \"192810e5-d32d-46e6-9296-9060df1a6b5c\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.630766 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192810e5-d32d-46e6-9296-9060df1a6b5c-log-httpd\") pod \"192810e5-d32d-46e6-9296-9060df1a6b5c\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.630788 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192810e5-d32d-46e6-9296-9060df1a6b5c-run-httpd\") pod \"192810e5-d32d-46e6-9296-9060df1a6b5c\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.630812 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-config-data\") pod \"192810e5-d32d-46e6-9296-9060df1a6b5c\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.630890 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-sg-core-conf-yaml\") pod \"192810e5-d32d-46e6-9296-9060df1a6b5c\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.630985 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-scripts\") pod \"192810e5-d32d-46e6-9296-9060df1a6b5c\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.631014 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87x77\" (UniqueName: \"kubernetes.io/projected/192810e5-d32d-46e6-9296-9060df1a6b5c-kube-api-access-87x77\") pod \"192810e5-d32d-46e6-9296-9060df1a6b5c\" (UID: \"192810e5-d32d-46e6-9296-9060df1a6b5c\") " Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.631780 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/192810e5-d32d-46e6-9296-9060df1a6b5c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "192810e5-d32d-46e6-9296-9060df1a6b5c" (UID: "192810e5-d32d-46e6-9296-9060df1a6b5c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.632165 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/192810e5-d32d-46e6-9296-9060df1a6b5c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "192810e5-d32d-46e6-9296-9060df1a6b5c" (UID: "192810e5-d32d-46e6-9296-9060df1a6b5c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.636435 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/192810e5-d32d-46e6-9296-9060df1a6b5c-kube-api-access-87x77" (OuterVolumeSpecName: "kube-api-access-87x77") pod "192810e5-d32d-46e6-9296-9060df1a6b5c" (UID: "192810e5-d32d-46e6-9296-9060df1a6b5c"). InnerVolumeSpecName "kube-api-access-87x77". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.653691 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-scripts" (OuterVolumeSpecName: "scripts") pod "192810e5-d32d-46e6-9296-9060df1a6b5c" (UID: "192810e5-d32d-46e6-9296-9060df1a6b5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.682424 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "192810e5-d32d-46e6-9296-9060df1a6b5c" (UID: "192810e5-d32d-46e6-9296-9060df1a6b5c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.704367 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "192810e5-d32d-46e6-9296-9060df1a6b5c" (UID: "192810e5-d32d-46e6-9296-9060df1a6b5c"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.733466 4776 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.733513 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192810e5-d32d-46e6-9296-9060df1a6b5c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.733527 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192810e5-d32d-46e6-9296-9060df1a6b5c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.733542 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.733554 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.733564 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87x77\" (UniqueName: \"kubernetes.io/projected/192810e5-d32d-46e6-9296-9060df1a6b5c-kube-api-access-87x77\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.737373 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "192810e5-d32d-46e6-9296-9060df1a6b5c" (UID: 
"192810e5-d32d-46e6-9296-9060df1a6b5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.755067 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-config-data" (OuterVolumeSpecName: "config-data") pod "192810e5-d32d-46e6-9296-9060df1a6b5c" (UID: "192810e5-d32d-46e6-9296-9060df1a6b5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.768406 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.835126 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.835161 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192810e5-d32d-46e6-9296-9060df1a6b5c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.939648 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29811233-5318-47b6-8145-9057d38506f8-logs\") pod \"29811233-5318-47b6-8145-9057d38506f8\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.940068 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-265fx\" (UniqueName: \"kubernetes.io/projected/29811233-5318-47b6-8145-9057d38506f8-kube-api-access-265fx\") pod \"29811233-5318-47b6-8145-9057d38506f8\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " Dec 04 10:03:00 crc 
kubenswrapper[4776]: I1204 10:03:00.940148 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29811233-5318-47b6-8145-9057d38506f8-config-data\") pod \"29811233-5318-47b6-8145-9057d38506f8\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.940258 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29811233-5318-47b6-8145-9057d38506f8-combined-ca-bundle\") pod \"29811233-5318-47b6-8145-9057d38506f8\" (UID: \"29811233-5318-47b6-8145-9057d38506f8\") " Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.942761 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29811233-5318-47b6-8145-9057d38506f8-logs" (OuterVolumeSpecName: "logs") pod "29811233-5318-47b6-8145-9057d38506f8" (UID: "29811233-5318-47b6-8145-9057d38506f8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.944535 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29811233-5318-47b6-8145-9057d38506f8-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.947342 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29811233-5318-47b6-8145-9057d38506f8-kube-api-access-265fx" (OuterVolumeSpecName: "kube-api-access-265fx") pod "29811233-5318-47b6-8145-9057d38506f8" (UID: "29811233-5318-47b6-8145-9057d38506f8"). InnerVolumeSpecName "kube-api-access-265fx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.974039 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29811233-5318-47b6-8145-9057d38506f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29811233-5318-47b6-8145-9057d38506f8" (UID: "29811233-5318-47b6-8145-9057d38506f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:00 crc kubenswrapper[4776]: I1204 10:03:00.993980 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29811233-5318-47b6-8145-9057d38506f8-config-data" (OuterVolumeSpecName: "config-data") pod "29811233-5318-47b6-8145-9057d38506f8" (UID: "29811233-5318-47b6-8145-9057d38506f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.046030 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29811233-5318-47b6-8145-9057d38506f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.046073 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-265fx\" (UniqueName: \"kubernetes.io/projected/29811233-5318-47b6-8145-9057d38506f8-kube-api-access-265fx\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.046113 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29811233-5318-47b6-8145-9057d38506f8-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.259359 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"192810e5-d32d-46e6-9296-9060df1a6b5c","Type":"ContainerDied","Data":"424efd1773a085a4d2f219f4dbb5fed107241dc2828c7b2ec31c9763544a395a"} Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.260371 4776 scope.go:117] "RemoveContainer" containerID="be1ae40139ed9829dec88192d8855694041c6f0c8d8d6c161c1e426e08a76611" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.259970 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.264416 4776 generic.go:334] "Generic (PLEG): container finished" podID="29811233-5318-47b6-8145-9057d38506f8" containerID="6df2de359d6a46531bd9714c729e48cb1e38b183aa45ee6f3cad523d91ea30e6" exitCode=0 Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.264470 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.264469 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29811233-5318-47b6-8145-9057d38506f8","Type":"ContainerDied","Data":"6df2de359d6a46531bd9714c729e48cb1e38b183aa45ee6f3cad523d91ea30e6"} Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.265046 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29811233-5318-47b6-8145-9057d38506f8","Type":"ContainerDied","Data":"f5cf26ccc4b958b86cbac13df9189a9446f05f7e1da12485504328915c04ad34"} Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.283103 4776 scope.go:117] "RemoveContainer" containerID="8438854833ed832965bd81e2270334dc501c482ad4e734aba419d12f2745b800" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.306586 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.320751 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 
10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.333800 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.337473 4776 scope.go:117] "RemoveContainer" containerID="d94138b8a0c74249c98290694c73f1dce1346e577b6fdf736b13d2eaccaa148b" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.346417 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.366002 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:03:01 crc kubenswrapper[4776]: E1204 10:03:01.366497 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="ceilometer-notification-agent" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.366519 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="ceilometer-notification-agent" Dec 04 10:03:01 crc kubenswrapper[4776]: E1204 10:03:01.366543 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29811233-5318-47b6-8145-9057d38506f8" containerName="nova-api-api" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.366554 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="29811233-5318-47b6-8145-9057d38506f8" containerName="nova-api-api" Dec 04 10:03:01 crc kubenswrapper[4776]: E1204 10:03:01.366578 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="proxy-httpd" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.366587 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="proxy-httpd" Dec 04 10:03:01 crc kubenswrapper[4776]: E1204 10:03:01.366603 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" 
containerName="ceilometer-central-agent" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.366614 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="ceilometer-central-agent" Dec 04 10:03:01 crc kubenswrapper[4776]: E1204 10:03:01.366633 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="sg-core" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.366640 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="sg-core" Dec 04 10:03:01 crc kubenswrapper[4776]: E1204 10:03:01.366656 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29811233-5318-47b6-8145-9057d38506f8" containerName="nova-api-log" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.366668 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="29811233-5318-47b6-8145-9057d38506f8" containerName="nova-api-log" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.368861 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="proxy-httpd" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.368888 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="29811233-5318-47b6-8145-9057d38506f8" containerName="nova-api-api" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.368904 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="ceilometer-notification-agent" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.368928 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="ceilometer-central-agent" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.368942 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="29811233-5318-47b6-8145-9057d38506f8" containerName="nova-api-log" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.368963 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" containerName="sg-core" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.370782 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.373637 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.377047 4776 scope.go:117] "RemoveContainer" containerID="750d80ede973f7baf821f529046d5199274ecf5ed45ae16d0f9769398694187e" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.377069 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.378768 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.395377 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.397865 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.401124 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.401398 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.401738 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.419738 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.438344 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.460129 4776 scope.go:117] "RemoveContainer" containerID="6df2de359d6a46531bd9714c729e48cb1e38b183aa45ee6f3cad523d91ea30e6" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.467824 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="192810e5-d32d-46e6-9296-9060df1a6b5c" path="/var/lib/kubelet/pods/192810e5-d32d-46e6-9296-9060df1a6b5c/volumes" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.468777 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29811233-5318-47b6-8145-9057d38506f8" path="/var/lib/kubelet/pods/29811233-5318-47b6-8145-9057d38506f8/volumes" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.495946 4776 scope.go:117] "RemoveContainer" containerID="beb8ceba514f77fabf59331cb39a366295b226799d26e871b05774b2bbaf3b33" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.520450 4776 scope.go:117] "RemoveContainer" containerID="6df2de359d6a46531bd9714c729e48cb1e38b183aa45ee6f3cad523d91ea30e6" Dec 04 10:03:01 crc kubenswrapper[4776]: E1204 10:03:01.521192 4776 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6df2de359d6a46531bd9714c729e48cb1e38b183aa45ee6f3cad523d91ea30e6\": container with ID starting with 6df2de359d6a46531bd9714c729e48cb1e38b183aa45ee6f3cad523d91ea30e6 not found: ID does not exist" containerID="6df2de359d6a46531bd9714c729e48cb1e38b183aa45ee6f3cad523d91ea30e6" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.521254 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df2de359d6a46531bd9714c729e48cb1e38b183aa45ee6f3cad523d91ea30e6"} err="failed to get container status \"6df2de359d6a46531bd9714c729e48cb1e38b183aa45ee6f3cad523d91ea30e6\": rpc error: code = NotFound desc = could not find container \"6df2de359d6a46531bd9714c729e48cb1e38b183aa45ee6f3cad523d91ea30e6\": container with ID starting with 6df2de359d6a46531bd9714c729e48cb1e38b183aa45ee6f3cad523d91ea30e6 not found: ID does not exist" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.521288 4776 scope.go:117] "RemoveContainer" containerID="beb8ceba514f77fabf59331cb39a366295b226799d26e871b05774b2bbaf3b33" Dec 04 10:03:01 crc kubenswrapper[4776]: E1204 10:03:01.522121 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beb8ceba514f77fabf59331cb39a366295b226799d26e871b05774b2bbaf3b33\": container with ID starting with beb8ceba514f77fabf59331cb39a366295b226799d26e871b05774b2bbaf3b33 not found: ID does not exist" containerID="beb8ceba514f77fabf59331cb39a366295b226799d26e871b05774b2bbaf3b33" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.522240 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb8ceba514f77fabf59331cb39a366295b226799d26e871b05774b2bbaf3b33"} err="failed to get container status \"beb8ceba514f77fabf59331cb39a366295b226799d26e871b05774b2bbaf3b33\": rpc error: code = NotFound desc = could not find container 
\"beb8ceba514f77fabf59331cb39a366295b226799d26e871b05774b2bbaf3b33\": container with ID starting with beb8ceba514f77fabf59331cb39a366295b226799d26e871b05774b2bbaf3b33 not found: ID does not exist" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.554605 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgqxp\" (UniqueName: \"kubernetes.io/projected/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-kube-api-access-tgqxp\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.554675 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-scripts\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.554708 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-config-data\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.554744 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.554779 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.554807 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krpp2\" (UniqueName: \"kubernetes.io/projected/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-kube-api-access-krpp2\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.554833 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.554876 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.554984 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-log-httpd\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.555017 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc 
kubenswrapper[4776]: I1204 10:03:01.555060 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-public-tls-certs\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.555092 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-run-httpd\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.555109 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-logs\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.555133 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-config-data\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.656588 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.656693 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-log-httpd\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.656745 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.656804 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-public-tls-certs\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.656845 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-run-httpd\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.656865 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-logs\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.656895 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-config-data\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.656960 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgqxp\" (UniqueName: \"kubernetes.io/projected/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-kube-api-access-tgqxp\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.656982 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-scripts\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.657011 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-config-data\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.657043 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.657076 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.657103 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krpp2\" (UniqueName: \"kubernetes.io/projected/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-kube-api-access-krpp2\") pod \"nova-api-0\" 
(UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.657133 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.658805 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-logs\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.659155 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-log-httpd\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.659175 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-run-httpd\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.664185 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.667163 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-config-data\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.667649 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.668566 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-public-tls-certs\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.669119 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.669568 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-config-data\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.670151 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-scripts\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.670991 4776 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.677593 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.681558 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krpp2\" (UniqueName: \"kubernetes.io/projected/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-kube-api-access-krpp2\") pod \"nova-api-0\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.683088 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgqxp\" (UniqueName: \"kubernetes.io/projected/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-kube-api-access-tgqxp\") pod \"ceilometer-0\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.703594 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.731086 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.819953 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:03:01 crc kubenswrapper[4776]: I1204 10:03:01.865659 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.258280 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.281896 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493","Type":"ContainerStarted","Data":"072f3b97f95a0260a2f6ea5d0b63d7f7959fdb0db3bff5992540046614d40e68"} Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.299812 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.395206 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.589041 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mmks9"] Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.590791 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.593786 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.594030 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.601902 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mmks9"] Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.788458 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-config-data\") pod \"nova-cell1-cell-mapping-mmks9\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.788653 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mmks9\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.788689 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6z5q\" (UniqueName: \"kubernetes.io/projected/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-kube-api-access-w6z5q\") pod \"nova-cell1-cell-mapping-mmks9\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.788878 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-scripts\") pod \"nova-cell1-cell-mapping-mmks9\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.890233 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mmks9\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.890275 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6z5q\" (UniqueName: \"kubernetes.io/projected/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-kube-api-access-w6z5q\") pod \"nova-cell1-cell-mapping-mmks9\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.890328 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-scripts\") pod \"nova-cell1-cell-mapping-mmks9\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.890396 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-config-data\") pod \"nova-cell1-cell-mapping-mmks9\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.900542 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-config-data\") pod \"nova-cell1-cell-mapping-mmks9\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.906642 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mmks9\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.915188 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-scripts\") pod \"nova-cell1-cell-mapping-mmks9\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:02 crc kubenswrapper[4776]: I1204 10:03:02.941484 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6z5q\" (UniqueName: \"kubernetes.io/projected/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-kube-api-access-w6z5q\") pod \"nova-cell1-cell-mapping-mmks9\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:03 crc kubenswrapper[4776]: I1204 10:03:03.213608 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:03 crc kubenswrapper[4776]: I1204 10:03:03.312566 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493","Type":"ContainerStarted","Data":"57e5f2d8dfe1340f1bbf4d82a772f527b4ae3c223287b5f83c3c93b50fbfbe52"} Dec 04 10:03:03 crc kubenswrapper[4776]: I1204 10:03:03.331215 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53","Type":"ContainerStarted","Data":"f65c099625b3848f7997262df38cf95a673b7e15704966fc05df10481e3b1b16"} Dec 04 10:03:03 crc kubenswrapper[4776]: I1204 10:03:03.331270 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53","Type":"ContainerStarted","Data":"152a683da8c47f0b2283a60e04e4ee69cc2f14d316db0e453c31942038fb3c6a"} Dec 04 10:03:03 crc kubenswrapper[4776]: I1204 10:03:03.331287 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53","Type":"ContainerStarted","Data":"880dac1934e1083357c1d6a860b47b9d2f432944975232f9f9889f76de06c62b"} Dec 04 10:03:03 crc kubenswrapper[4776]: I1204 10:03:03.408763 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.408718894 podStartE2EDuration="2.408718894s" podCreationTimestamp="2025-12-04 10:03:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:03.357258002 +0000 UTC m=+1428.223738399" watchObservedRunningTime="2025-12-04 10:03:03.408718894 +0000 UTC m=+1428.275199271" Dec 04 10:03:03 crc kubenswrapper[4776]: I1204 10:03:03.744741 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mmks9"] Dec 04 10:03:04 crc 
kubenswrapper[4776]: I1204 10:03:04.344105 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mmks9" event={"ID":"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b","Type":"ContainerStarted","Data":"0fad2eb09a04b456b7c4f3cc1e90eb7bfdcb5860475f5763188003645156b94b"} Dec 04 10:03:04 crc kubenswrapper[4776]: I1204 10:03:04.344446 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mmks9" event={"ID":"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b","Type":"ContainerStarted","Data":"c0f1dd122f6eb5d77224e7d03d3d46c82296e36e06cc4906e770e3831be5b7e2"} Dec 04 10:03:04 crc kubenswrapper[4776]: I1204 10:03:04.348044 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493","Type":"ContainerStarted","Data":"707951a4e45b874626c60aea2f222d5647d91a26dfa80b3aa7192d8f046661f9"} Dec 04 10:03:04 crc kubenswrapper[4776]: I1204 10:03:04.370315 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mmks9" podStartSLOduration=2.370289263 podStartE2EDuration="2.370289263s" podCreationTimestamp="2025-12-04 10:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:04.36570635 +0000 UTC m=+1429.232186747" watchObservedRunningTime="2025-12-04 10:03:04.370289263 +0000 UTC m=+1429.236769640" Dec 04 10:03:04 crc kubenswrapper[4776]: I1204 10:03:04.673638 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:03:04 crc kubenswrapper[4776]: I1204 10:03:04.781234 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-hkg68"] Dec 04 10:03:04 crc kubenswrapper[4776]: I1204 10:03:04.781587 4776 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-566b5b7845-hkg68" podUID="e63d098b-b030-47b1-be04-01e6de4d6cc9" containerName="dnsmasq-dns" containerID="cri-o://4d347dd11bc21f3d558c16b41b175ba9ac7115d92350b6b9805b0de6e6720a89" gracePeriod=10 Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.045651 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-566b5b7845-hkg68" podUID="e63d098b-b030-47b1-be04-01e6de4d6cc9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.172:5353: connect: connection refused" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.340750 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.368895 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493","Type":"ContainerStarted","Data":"a9d0b87cf6d5de710e126a0cfea117faacc95e39f3ee7b40a8af3368ec61d917"} Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.371274 4776 generic.go:334] "Generic (PLEG): container finished" podID="e63d098b-b030-47b1-be04-01e6de4d6cc9" containerID="4d347dd11bc21f3d558c16b41b175ba9ac7115d92350b6b9805b0de6e6720a89" exitCode=0 Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.372060 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-hkg68" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.372388 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-hkg68" event={"ID":"e63d098b-b030-47b1-be04-01e6de4d6cc9","Type":"ContainerDied","Data":"4d347dd11bc21f3d558c16b41b175ba9ac7115d92350b6b9805b0de6e6720a89"} Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.372434 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-hkg68" event={"ID":"e63d098b-b030-47b1-be04-01e6de4d6cc9","Type":"ContainerDied","Data":"acf9ddfb34f34654e81d3461a4ff6766ae7672b657c09a791b937541593fd46a"} Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.372456 4776 scope.go:117] "RemoveContainer" containerID="4d347dd11bc21f3d558c16b41b175ba9ac7115d92350b6b9805b0de6e6720a89" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.409906 4776 scope.go:117] "RemoveContainer" containerID="baedeaf1b891cf7298d43ee0172db044bd48fb71a254e70d2b7bd287dc4bfae3" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.459968 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46c5l\" (UniqueName: \"kubernetes.io/projected/e63d098b-b030-47b1-be04-01e6de4d6cc9-kube-api-access-46c5l\") pod \"e63d098b-b030-47b1-be04-01e6de4d6cc9\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.460099 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-ovsdbserver-nb\") pod \"e63d098b-b030-47b1-be04-01e6de4d6cc9\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.460195 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-config\") pod \"e63d098b-b030-47b1-be04-01e6de4d6cc9\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.460219 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-dns-svc\") pod \"e63d098b-b030-47b1-be04-01e6de4d6cc9\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.460387 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-ovsdbserver-sb\") pod \"e63d098b-b030-47b1-be04-01e6de4d6cc9\" (UID: \"e63d098b-b030-47b1-be04-01e6de4d6cc9\") " Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.460682 4776 scope.go:117] "RemoveContainer" containerID="4d347dd11bc21f3d558c16b41b175ba9ac7115d92350b6b9805b0de6e6720a89" Dec 04 10:03:05 crc kubenswrapper[4776]: E1204 10:03:05.463740 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d347dd11bc21f3d558c16b41b175ba9ac7115d92350b6b9805b0de6e6720a89\": container with ID starting with 4d347dd11bc21f3d558c16b41b175ba9ac7115d92350b6b9805b0de6e6720a89 not found: ID does not exist" containerID="4d347dd11bc21f3d558c16b41b175ba9ac7115d92350b6b9805b0de6e6720a89" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.463781 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d347dd11bc21f3d558c16b41b175ba9ac7115d92350b6b9805b0de6e6720a89"} err="failed to get container status \"4d347dd11bc21f3d558c16b41b175ba9ac7115d92350b6b9805b0de6e6720a89\": rpc error: code = NotFound desc = could not find container \"4d347dd11bc21f3d558c16b41b175ba9ac7115d92350b6b9805b0de6e6720a89\": container with ID starting 
with 4d347dd11bc21f3d558c16b41b175ba9ac7115d92350b6b9805b0de6e6720a89 not found: ID does not exist" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.463814 4776 scope.go:117] "RemoveContainer" containerID="baedeaf1b891cf7298d43ee0172db044bd48fb71a254e70d2b7bd287dc4bfae3" Dec 04 10:03:05 crc kubenswrapper[4776]: E1204 10:03:05.467510 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baedeaf1b891cf7298d43ee0172db044bd48fb71a254e70d2b7bd287dc4bfae3\": container with ID starting with baedeaf1b891cf7298d43ee0172db044bd48fb71a254e70d2b7bd287dc4bfae3 not found: ID does not exist" containerID="baedeaf1b891cf7298d43ee0172db044bd48fb71a254e70d2b7bd287dc4bfae3" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.467572 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baedeaf1b891cf7298d43ee0172db044bd48fb71a254e70d2b7bd287dc4bfae3"} err="failed to get container status \"baedeaf1b891cf7298d43ee0172db044bd48fb71a254e70d2b7bd287dc4bfae3\": rpc error: code = NotFound desc = could not find container \"baedeaf1b891cf7298d43ee0172db044bd48fb71a254e70d2b7bd287dc4bfae3\": container with ID starting with baedeaf1b891cf7298d43ee0172db044bd48fb71a254e70d2b7bd287dc4bfae3 not found: ID does not exist" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.470179 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e63d098b-b030-47b1-be04-01e6de4d6cc9-kube-api-access-46c5l" (OuterVolumeSpecName: "kube-api-access-46c5l") pod "e63d098b-b030-47b1-be04-01e6de4d6cc9" (UID: "e63d098b-b030-47b1-be04-01e6de4d6cc9"). InnerVolumeSpecName "kube-api-access-46c5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.541868 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-config" (OuterVolumeSpecName: "config") pod "e63d098b-b030-47b1-be04-01e6de4d6cc9" (UID: "e63d098b-b030-47b1-be04-01e6de4d6cc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.548750 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e63d098b-b030-47b1-be04-01e6de4d6cc9" (UID: "e63d098b-b030-47b1-be04-01e6de4d6cc9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.562633 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e63d098b-b030-47b1-be04-01e6de4d6cc9" (UID: "e63d098b-b030-47b1-be04-01e6de4d6cc9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.566518 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e63d098b-b030-47b1-be04-01e6de4d6cc9" (UID: "e63d098b-b030-47b1-be04-01e6de4d6cc9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.566587 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46c5l\" (UniqueName: \"kubernetes.io/projected/e63d098b-b030-47b1-be04-01e6de4d6cc9-kube-api-access-46c5l\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.566631 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.566644 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.566657 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.668265 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e63d098b-b030-47b1-be04-01e6de4d6cc9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.709988 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-hkg68"] Dec 04 10:03:05 crc kubenswrapper[4776]: I1204 10:03:05.721093 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-hkg68"] Dec 04 10:03:06 crc kubenswrapper[4776]: I1204 10:03:06.396436 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493","Type":"ContainerStarted","Data":"56a1c1505ce2a8267cabdc361250ab25e09e03d00b68ef4c7155a7dcdf3311b7"} Dec 04 10:03:06 crc kubenswrapper[4776]: I1204 10:03:06.398296 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 10:03:07 crc kubenswrapper[4776]: I1204 10:03:07.465232 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e63d098b-b030-47b1-be04-01e6de4d6cc9" path="/var/lib/kubelet/pods/e63d098b-b030-47b1-be04-01e6de4d6cc9/volumes" Dec 04 10:03:10 crc kubenswrapper[4776]: I1204 10:03:10.456635 4776 generic.go:334] "Generic (PLEG): container finished" podID="4a4e1405-c0c0-4fb1-a9bb-a93612a2528b" containerID="0fad2eb09a04b456b7c4f3cc1e90eb7bfdcb5860475f5763188003645156b94b" exitCode=0 Dec 04 10:03:10 crc kubenswrapper[4776]: I1204 10:03:10.456689 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mmks9" event={"ID":"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b","Type":"ContainerDied","Data":"0fad2eb09a04b456b7c4f3cc1e90eb7bfdcb5860475f5763188003645156b94b"} Dec 04 10:03:10 crc kubenswrapper[4776]: I1204 10:03:10.477186 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.887147944 podStartE2EDuration="9.477168626s" podCreationTimestamp="2025-12-04 10:03:01 +0000 UTC" firstStartedPulling="2025-12-04 10:03:02.263922498 +0000 UTC m=+1427.130402875" lastFinishedPulling="2025-12-04 10:03:05.85394317 +0000 UTC m=+1430.720423557" observedRunningTime="2025-12-04 10:03:06.432804896 +0000 UTC m=+1431.299285273" watchObservedRunningTime="2025-12-04 10:03:10.477168626 +0000 UTC m=+1435.343649003" Dec 04 10:03:11 crc kubenswrapper[4776]: I1204 10:03:11.732038 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:03:11 crc kubenswrapper[4776]: I1204 10:03:11.732357 4776 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:03:11 crc kubenswrapper[4776]: I1204 10:03:11.846761 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:11 crc kubenswrapper[4776]: I1204 10:03:11.911335 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-scripts\") pod \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " Dec 04 10:03:11 crc kubenswrapper[4776]: I1204 10:03:11.911605 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-combined-ca-bundle\") pod \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " Dec 04 10:03:11 crc kubenswrapper[4776]: I1204 10:03:11.911748 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-config-data\") pod \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " Dec 04 10:03:11 crc kubenswrapper[4776]: I1204 10:03:11.911868 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6z5q\" (UniqueName: \"kubernetes.io/projected/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-kube-api-access-w6z5q\") pod \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\" (UID: \"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b\") " Dec 04 10:03:11 crc kubenswrapper[4776]: I1204 10:03:11.922237 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-kube-api-access-w6z5q" (OuterVolumeSpecName: "kube-api-access-w6z5q") pod "4a4e1405-c0c0-4fb1-a9bb-a93612a2528b" 
(UID: "4a4e1405-c0c0-4fb1-a9bb-a93612a2528b"). InnerVolumeSpecName "kube-api-access-w6z5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:11 crc kubenswrapper[4776]: I1204 10:03:11.929132 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-scripts" (OuterVolumeSpecName: "scripts") pod "4a4e1405-c0c0-4fb1-a9bb-a93612a2528b" (UID: "4a4e1405-c0c0-4fb1-a9bb-a93612a2528b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:11 crc kubenswrapper[4776]: I1204 10:03:11.944903 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a4e1405-c0c0-4fb1-a9bb-a93612a2528b" (UID: "4a4e1405-c0c0-4fb1-a9bb-a93612a2528b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:11 crc kubenswrapper[4776]: I1204 10:03:11.952708 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-config-data" (OuterVolumeSpecName: "config-data") pod "4a4e1405-c0c0-4fb1-a9bb-a93612a2528b" (UID: "4a4e1405-c0c0-4fb1-a9bb-a93612a2528b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.015333 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.015375 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.015388 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.015400 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6z5q\" (UniqueName: \"kubernetes.io/projected/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b-kube-api-access-w6z5q\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.481878 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mmks9" event={"ID":"4a4e1405-c0c0-4fb1-a9bb-a93612a2528b","Type":"ContainerDied","Data":"c0f1dd122f6eb5d77224e7d03d3d46c82296e36e06cc4906e770e3831be5b7e2"} Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.481934 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0f1dd122f6eb5d77224e7d03d3d46c82296e36e06cc4906e770e3831be5b7e2" Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.481991 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mmks9" Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.687889 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.688750 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" containerName="nova-api-log" containerID="cri-o://152a683da8c47f0b2283a60e04e4ee69cc2f14d316db0e453c31942038fb3c6a" gracePeriod=30 Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.689029 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" containerName="nova-api-api" containerID="cri-o://f65c099625b3848f7997262df38cf95a673b7e15704966fc05df10481e3b1b16" gracePeriod=30 Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.709398 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.709652 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4de101ce-a0bf-4231-a572-a26b48e55f24" containerName="nova-scheduler-scheduler" containerID="cri-o://df0e6cda7c5cf8619f1a087d6feff2e8abebb06036121a5e6aeb4caddbfdf484" gracePeriod=30 Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.735718 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.185:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.735826 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" containerName="nova-api-log" 
probeResult="failure" output="Get \"https://10.217.0.185:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.805633 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.805943 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerName="nova-metadata-log" containerID="cri-o://5b5d9466438db62721cb4e543cab2c3ca1c0066bbe0bc9376a1a0ea7cc7ae6f4" gracePeriod=30 Dec 04 10:03:12 crc kubenswrapper[4776]: I1204 10:03:12.806512 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerName="nova-metadata-metadata" containerID="cri-o://5f96f7f2d202753560fcf2a08a80ca0ca2eeff02be1d727aabc5620f3c969d51" gracePeriod=30 Dec 04 10:03:13 crc kubenswrapper[4776]: I1204 10:03:13.493472 4776 generic.go:334] "Generic (PLEG): container finished" podID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerID="5b5d9466438db62721cb4e543cab2c3ca1c0066bbe0bc9376a1a0ea7cc7ae6f4" exitCode=143 Dec 04 10:03:13 crc kubenswrapper[4776]: I1204 10:03:13.493545 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"97af0008-c8e6-4448-99c1-1a465bd92ac9","Type":"ContainerDied","Data":"5b5d9466438db62721cb4e543cab2c3ca1c0066bbe0bc9376a1a0ea7cc7ae6f4"} Dec 04 10:03:13 crc kubenswrapper[4776]: I1204 10:03:13.495709 4776 generic.go:334] "Generic (PLEG): container finished" podID="e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" containerID="152a683da8c47f0b2283a60e04e4ee69cc2f14d316db0e453c31942038fb3c6a" exitCode=143 Dec 04 10:03:13 crc kubenswrapper[4776]: I1204 10:03:13.495735 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53","Type":"ContainerDied","Data":"152a683da8c47f0b2283a60e04e4ee69cc2f14d316db0e453c31942038fb3c6a"} Dec 04 10:03:13 crc kubenswrapper[4776]: I1204 10:03:13.931974 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g57lm"] Dec 04 10:03:13 crc kubenswrapper[4776]: E1204 10:03:13.932829 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63d098b-b030-47b1-be04-01e6de4d6cc9" containerName="dnsmasq-dns" Dec 04 10:03:13 crc kubenswrapper[4776]: I1204 10:03:13.932844 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63d098b-b030-47b1-be04-01e6de4d6cc9" containerName="dnsmasq-dns" Dec 04 10:03:13 crc kubenswrapper[4776]: E1204 10:03:13.932867 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a4e1405-c0c0-4fb1-a9bb-a93612a2528b" containerName="nova-manage" Dec 04 10:03:13 crc kubenswrapper[4776]: I1204 10:03:13.932874 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a4e1405-c0c0-4fb1-a9bb-a93612a2528b" containerName="nova-manage" Dec 04 10:03:13 crc kubenswrapper[4776]: E1204 10:03:13.932900 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63d098b-b030-47b1-be04-01e6de4d6cc9" containerName="init" Dec 04 10:03:13 crc kubenswrapper[4776]: I1204 10:03:13.932906 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63d098b-b030-47b1-be04-01e6de4d6cc9" containerName="init" Dec 04 10:03:13 crc kubenswrapper[4776]: I1204 10:03:13.933111 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a4e1405-c0c0-4fb1-a9bb-a93612a2528b" containerName="nova-manage" Dec 04 10:03:13 crc kubenswrapper[4776]: I1204 10:03:13.933124 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e63d098b-b030-47b1-be04-01e6de4d6cc9" containerName="dnsmasq-dns" Dec 04 10:03:13 crc kubenswrapper[4776]: I1204 10:03:13.934698 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:13 crc kubenswrapper[4776]: I1204 10:03:13.966886 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g57lm"] Dec 04 10:03:14 crc kubenswrapper[4776]: I1204 10:03:14.058317 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmjgm\" (UniqueName: \"kubernetes.io/projected/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-kube-api-access-cmjgm\") pod \"redhat-operators-g57lm\" (UID: \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\") " pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:14 crc kubenswrapper[4776]: I1204 10:03:14.058663 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-catalog-content\") pod \"redhat-operators-g57lm\" (UID: \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\") " pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:14 crc kubenswrapper[4776]: I1204 10:03:14.058856 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-utilities\") pod \"redhat-operators-g57lm\" (UID: \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\") " pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:14 crc kubenswrapper[4776]: I1204 10:03:14.160525 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmjgm\" (UniqueName: \"kubernetes.io/projected/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-kube-api-access-cmjgm\") pod \"redhat-operators-g57lm\" (UID: \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\") " pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:14 crc kubenswrapper[4776]: I1204 10:03:14.160590 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-catalog-content\") pod \"redhat-operators-g57lm\" (UID: \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\") " pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:14 crc kubenswrapper[4776]: I1204 10:03:14.161201 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-catalog-content\") pod \"redhat-operators-g57lm\" (UID: \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\") " pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:14 crc kubenswrapper[4776]: I1204 10:03:14.161267 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-utilities\") pod \"redhat-operators-g57lm\" (UID: \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\") " pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:14 crc kubenswrapper[4776]: I1204 10:03:14.161522 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-utilities\") pod \"redhat-operators-g57lm\" (UID: \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\") " pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:14 crc kubenswrapper[4776]: I1204 10:03:14.192123 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmjgm\" (UniqueName: \"kubernetes.io/projected/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-kube-api-access-cmjgm\") pod \"redhat-operators-g57lm\" (UID: \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\") " pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:14 crc kubenswrapper[4776]: I1204 10:03:14.257300 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:14 crc kubenswrapper[4776]: I1204 10:03:14.787777 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g57lm"] Dec 04 10:03:15 crc kubenswrapper[4776]: I1204 10:03:15.521712 4776 generic.go:334] "Generic (PLEG): container finished" podID="c01f0a3b-848f-4278-8ed4-4ef03122aeb2" containerID="fe659c268137a00429a63582a49d4675c98877e1fde361f705e0b48a9879eb81" exitCode=0 Dec 04 10:03:15 crc kubenswrapper[4776]: I1204 10:03:15.521775 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g57lm" event={"ID":"c01f0a3b-848f-4278-8ed4-4ef03122aeb2","Type":"ContainerDied","Data":"fe659c268137a00429a63582a49d4675c98877e1fde361f705e0b48a9879eb81"} Dec 04 10:03:15 crc kubenswrapper[4776]: I1204 10:03:15.523761 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g57lm" event={"ID":"c01f0a3b-848f-4278-8ed4-4ef03122aeb2","Type":"ContainerStarted","Data":"1abc42164451414f815844e7b288595c4b9119da042693514d0efd5ad7eba9bc"} Dec 04 10:03:15 crc kubenswrapper[4776]: I1204 10:03:15.932723 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": read tcp 10.217.0.2:36876->10.217.0.177:8775: read: connection reset by peer" Dec 04 10:03:15 crc kubenswrapper[4776]: I1204 10:03:15.932768 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": read tcp 10.217.0.2:36878->10.217.0.177:8775: read: connection reset by peer" Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.545829 4776 generic.go:334] "Generic (PLEG): 
container finished" podID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerID="5f96f7f2d202753560fcf2a08a80ca0ca2eeff02be1d727aabc5620f3c969d51" exitCode=0 Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.545955 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"97af0008-c8e6-4448-99c1-1a465bd92ac9","Type":"ContainerDied","Data":"5f96f7f2d202753560fcf2a08a80ca0ca2eeff02be1d727aabc5620f3c969d51"} Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.641384 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.719923 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-combined-ca-bundle\") pod \"97af0008-c8e6-4448-99c1-1a465bd92ac9\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.720096 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97af0008-c8e6-4448-99c1-1a465bd92ac9-logs\") pod \"97af0008-c8e6-4448-99c1-1a465bd92ac9\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.720120 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgcmz\" (UniqueName: \"kubernetes.io/projected/97af0008-c8e6-4448-99c1-1a465bd92ac9-kube-api-access-zgcmz\") pod \"97af0008-c8e6-4448-99c1-1a465bd92ac9\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.720205 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-config-data\") pod \"97af0008-c8e6-4448-99c1-1a465bd92ac9\" (UID: 
\"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.720227 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-nova-metadata-tls-certs\") pod \"97af0008-c8e6-4448-99c1-1a465bd92ac9\" (UID: \"97af0008-c8e6-4448-99c1-1a465bd92ac9\") " Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.721263 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97af0008-c8e6-4448-99c1-1a465bd92ac9-logs" (OuterVolumeSpecName: "logs") pod "97af0008-c8e6-4448-99c1-1a465bd92ac9" (UID: "97af0008-c8e6-4448-99c1-1a465bd92ac9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.754485 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97af0008-c8e6-4448-99c1-1a465bd92ac9-kube-api-access-zgcmz" (OuterVolumeSpecName: "kube-api-access-zgcmz") pod "97af0008-c8e6-4448-99c1-1a465bd92ac9" (UID: "97af0008-c8e6-4448-99c1-1a465bd92ac9"). InnerVolumeSpecName "kube-api-access-zgcmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.760748 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97af0008-c8e6-4448-99c1-1a465bd92ac9" (UID: "97af0008-c8e6-4448-99c1-1a465bd92ac9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.762753 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-config-data" (OuterVolumeSpecName: "config-data") pod "97af0008-c8e6-4448-99c1-1a465bd92ac9" (UID: "97af0008-c8e6-4448-99c1-1a465bd92ac9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.800599 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "97af0008-c8e6-4448-99c1-1a465bd92ac9" (UID: "97af0008-c8e6-4448-99c1-1a465bd92ac9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.822122 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.822165 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97af0008-c8e6-4448-99c1-1a465bd92ac9-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.822176 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgcmz\" (UniqueName: \"kubernetes.io/projected/97af0008-c8e6-4448-99c1-1a465bd92ac9-kube-api-access-zgcmz\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.822187 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-config-data\") on node \"crc\" DevicePath 
\"\"" Dec 04 10:03:16 crc kubenswrapper[4776]: I1204 10:03:16.822200 4776 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/97af0008-c8e6-4448-99c1-1a465bd92ac9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:17 crc kubenswrapper[4776]: E1204 10:03:17.190442 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df0e6cda7c5cf8619f1a087d6feff2e8abebb06036121a5e6aeb4caddbfdf484 is running failed: container process not found" containerID="df0e6cda7c5cf8619f1a087d6feff2e8abebb06036121a5e6aeb4caddbfdf484" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:03:17 crc kubenswrapper[4776]: E1204 10:03:17.191502 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df0e6cda7c5cf8619f1a087d6feff2e8abebb06036121a5e6aeb4caddbfdf484 is running failed: container process not found" containerID="df0e6cda7c5cf8619f1a087d6feff2e8abebb06036121a5e6aeb4caddbfdf484" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:03:17 crc kubenswrapper[4776]: E1204 10:03:17.192053 4776 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df0e6cda7c5cf8619f1a087d6feff2e8abebb06036121a5e6aeb4caddbfdf484 is running failed: container process not found" containerID="df0e6cda7c5cf8619f1a087d6feff2e8abebb06036121a5e6aeb4caddbfdf484" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 10:03:17 crc kubenswrapper[4776]: E1204 10:03:17.192121 4776 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of df0e6cda7c5cf8619f1a087d6feff2e8abebb06036121a5e6aeb4caddbfdf484 is running failed: container process not found" 
probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4de101ce-a0bf-4231-a572-a26b48e55f24" containerName="nova-scheduler-scheduler" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.558022 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"97af0008-c8e6-4448-99c1-1a465bd92ac9","Type":"ContainerDied","Data":"fa27d66a40bc1a31f5c4a8a3d0ecf53ac145cb03a76a375d10283b51637be45f"} Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.558100 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.558125 4776 scope.go:117] "RemoveContainer" containerID="5f96f7f2d202753560fcf2a08a80ca0ca2eeff02be1d727aabc5620f3c969d51" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.564178 4776 generic.go:334] "Generic (PLEG): container finished" podID="4de101ce-a0bf-4231-a572-a26b48e55f24" containerID="df0e6cda7c5cf8619f1a087d6feff2e8abebb06036121a5e6aeb4caddbfdf484" exitCode=0 Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.564215 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4de101ce-a0bf-4231-a572-a26b48e55f24","Type":"ContainerDied","Data":"df0e6cda7c5cf8619f1a087d6feff2e8abebb06036121a5e6aeb4caddbfdf484"} Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.588811 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.591264 4776 scope.go:117] "RemoveContainer" containerID="5b5d9466438db62721cb4e543cab2c3ca1c0066bbe0bc9376a1a0ea7cc7ae6f4" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.604972 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.623742 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:03:17 
crc kubenswrapper[4776]: E1204 10:03:17.624142 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerName="nova-metadata-log" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.624156 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerName="nova-metadata-log" Dec 04 10:03:17 crc kubenswrapper[4776]: E1204 10:03:17.624174 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerName="nova-metadata-metadata" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.624180 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerName="nova-metadata-metadata" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.624354 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerName="nova-metadata-metadata" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.624376 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="97af0008-c8e6-4448-99c1-1a465bd92ac9" containerName="nova-metadata-log" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.625304 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.629783 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.630115 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.638976 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.741982 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840a5d28-ff84-411a-837a-5976118c262d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.742099 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvjkc\" (UniqueName: \"kubernetes.io/projected/840a5d28-ff84-411a-837a-5976118c262d-kube-api-access-zvjkc\") pod \"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.742133 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/840a5d28-ff84-411a-837a-5976118c262d-logs\") pod \"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.742149 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/840a5d28-ff84-411a-837a-5976118c262d-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.742221 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840a5d28-ff84-411a-837a-5976118c262d-config-data\") pod \"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.843871 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840a5d28-ff84-411a-837a-5976118c262d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.844323 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvjkc\" (UniqueName: \"kubernetes.io/projected/840a5d28-ff84-411a-837a-5976118c262d-kube-api-access-zvjkc\") pod \"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.844342 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/840a5d28-ff84-411a-837a-5976118c262d-logs\") pod \"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.844360 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/840a5d28-ff84-411a-837a-5976118c262d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 
10:03:17.844604 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840a5d28-ff84-411a-837a-5976118c262d-config-data\") pod \"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.845379 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/840a5d28-ff84-411a-837a-5976118c262d-logs\") pod \"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.851834 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/840a5d28-ff84-411a-837a-5976118c262d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.851850 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840a5d28-ff84-411a-837a-5976118c262d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.851990 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840a5d28-ff84-411a-837a-5976118c262d-config-data\") pod \"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.865201 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvjkc\" (UniqueName: \"kubernetes.io/projected/840a5d28-ff84-411a-837a-5976118c262d-kube-api-access-zvjkc\") pod 
\"nova-metadata-0\" (UID: \"840a5d28-ff84-411a-837a-5976118c262d\") " pod="openstack/nova-metadata-0" Dec 04 10:03:17 crc kubenswrapper[4776]: I1204 10:03:17.951427 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.087835 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.150963 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de101ce-a0bf-4231-a572-a26b48e55f24-config-data\") pod \"4de101ce-a0bf-4231-a572-a26b48e55f24\" (UID: \"4de101ce-a0bf-4231-a572-a26b48e55f24\") " Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.151053 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de101ce-a0bf-4231-a572-a26b48e55f24-combined-ca-bundle\") pod \"4de101ce-a0bf-4231-a572-a26b48e55f24\" (UID: \"4de101ce-a0bf-4231-a572-a26b48e55f24\") " Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.151101 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kncpk\" (UniqueName: \"kubernetes.io/projected/4de101ce-a0bf-4231-a572-a26b48e55f24-kube-api-access-kncpk\") pod \"4de101ce-a0bf-4231-a572-a26b48e55f24\" (UID: \"4de101ce-a0bf-4231-a572-a26b48e55f24\") " Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.167718 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de101ce-a0bf-4231-a572-a26b48e55f24-kube-api-access-kncpk" (OuterVolumeSpecName: "kube-api-access-kncpk") pod "4de101ce-a0bf-4231-a572-a26b48e55f24" (UID: "4de101ce-a0bf-4231-a572-a26b48e55f24"). InnerVolumeSpecName "kube-api-access-kncpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.198203 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de101ce-a0bf-4231-a572-a26b48e55f24-config-data" (OuterVolumeSpecName: "config-data") pod "4de101ce-a0bf-4231-a572-a26b48e55f24" (UID: "4de101ce-a0bf-4231-a572-a26b48e55f24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.200549 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de101ce-a0bf-4231-a572-a26b48e55f24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4de101ce-a0bf-4231-a572-a26b48e55f24" (UID: "4de101ce-a0bf-4231-a572-a26b48e55f24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.253758 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de101ce-a0bf-4231-a572-a26b48e55f24-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.253810 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de101ce-a0bf-4231-a572-a26b48e55f24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.253826 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kncpk\" (UniqueName: \"kubernetes.io/projected/4de101ce-a0bf-4231-a572-a26b48e55f24-kube-api-access-kncpk\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:18 crc kubenswrapper[4776]: W1204 10:03:18.476120 4776 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod840a5d28_ff84_411a_837a_5976118c262d.slice/crio-0b04e12406806329cf9ef5af3a7986e339c8d6f46a40e4827a879957e0cc4003 WatchSource:0}: Error finding container 0b04e12406806329cf9ef5af3a7986e339c8d6f46a40e4827a879957e0cc4003: Status 404 returned error can't find the container with id 0b04e12406806329cf9ef5af3a7986e339c8d6f46a40e4827a879957e0cc4003 Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.480958 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.579093 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g57lm" event={"ID":"c01f0a3b-848f-4278-8ed4-4ef03122aeb2","Type":"ContainerStarted","Data":"7fe1de89d349557573a3645b6f0d2081741861857883df6d894ae15ad852b15f"} Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.587642 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"840a5d28-ff84-411a-837a-5976118c262d","Type":"ContainerStarted","Data":"0b04e12406806329cf9ef5af3a7986e339c8d6f46a40e4827a879957e0cc4003"} Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.590580 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4de101ce-a0bf-4231-a572-a26b48e55f24","Type":"ContainerDied","Data":"e6659a70f65513d5e15f7187bc7c598dc85e2a14b2136dbfd3755b8b93efeb62"} Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.590611 4776 scope.go:117] "RemoveContainer" containerID="df0e6cda7c5cf8619f1a087d6feff2e8abebb06036121a5e6aeb4caddbfdf484" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.590766 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.644752 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.659619 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.679897 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:03:18 crc kubenswrapper[4776]: E1204 10:03:18.680445 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de101ce-a0bf-4231-a572-a26b48e55f24" containerName="nova-scheduler-scheduler" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.680457 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de101ce-a0bf-4231-a572-a26b48e55f24" containerName="nova-scheduler-scheduler" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.680655 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de101ce-a0bf-4231-a572-a26b48e55f24" containerName="nova-scheduler-scheduler" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.681450 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.695670 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.735227 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.769230 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3abc5a5-f26f-4c50-9780-b79f683b4243-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a3abc5a5-f26f-4c50-9780-b79f683b4243\") " pod="openstack/nova-scheduler-0" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.769369 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqqhb\" (UniqueName: \"kubernetes.io/projected/a3abc5a5-f26f-4c50-9780-b79f683b4243-kube-api-access-wqqhb\") pod \"nova-scheduler-0\" (UID: \"a3abc5a5-f26f-4c50-9780-b79f683b4243\") " pod="openstack/nova-scheduler-0" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.769446 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3abc5a5-f26f-4c50-9780-b79f683b4243-config-data\") pod \"nova-scheduler-0\" (UID: \"a3abc5a5-f26f-4c50-9780-b79f683b4243\") " pod="openstack/nova-scheduler-0" Dec 04 10:03:18 crc kubenswrapper[4776]: E1204 10:03:18.826669 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4de101ce_a0bf_4231_a572_a26b48e55f24.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4de101ce_a0bf_4231_a572_a26b48e55f24.slice/crio-e6659a70f65513d5e15f7187bc7c598dc85e2a14b2136dbfd3755b8b93efeb62\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc01f0a3b_848f_4278_8ed4_4ef03122aeb2.slice/crio-7fe1de89d349557573a3645b6f0d2081741861857883df6d894ae15ad852b15f.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.871090 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3abc5a5-f26f-4c50-9780-b79f683b4243-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a3abc5a5-f26f-4c50-9780-b79f683b4243\") " pod="openstack/nova-scheduler-0" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.871171 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqqhb\" (UniqueName: \"kubernetes.io/projected/a3abc5a5-f26f-4c50-9780-b79f683b4243-kube-api-access-wqqhb\") pod \"nova-scheduler-0\" (UID: \"a3abc5a5-f26f-4c50-9780-b79f683b4243\") " pod="openstack/nova-scheduler-0" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.871231 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3abc5a5-f26f-4c50-9780-b79f683b4243-config-data\") pod \"nova-scheduler-0\" (UID: \"a3abc5a5-f26f-4c50-9780-b79f683b4243\") " pod="openstack/nova-scheduler-0" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.877232 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3abc5a5-f26f-4c50-9780-b79f683b4243-config-data\") pod \"nova-scheduler-0\" (UID: \"a3abc5a5-f26f-4c50-9780-b79f683b4243\") " pod="openstack/nova-scheduler-0" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.877270 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3abc5a5-f26f-4c50-9780-b79f683b4243-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a3abc5a5-f26f-4c50-9780-b79f683b4243\") " pod="openstack/nova-scheduler-0" Dec 04 10:03:18 crc kubenswrapper[4776]: I1204 10:03:18.889234 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqqhb\" (UniqueName: \"kubernetes.io/projected/a3abc5a5-f26f-4c50-9780-b79f683b4243-kube-api-access-wqqhb\") pod \"nova-scheduler-0\" (UID: \"a3abc5a5-f26f-4c50-9780-b79f683b4243\") " pod="openstack/nova-scheduler-0" Dec 04 10:03:19 crc kubenswrapper[4776]: I1204 10:03:19.072893 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 10:03:19 crc kubenswrapper[4776]: I1204 10:03:19.466544 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de101ce-a0bf-4231-a572-a26b48e55f24" path="/var/lib/kubelet/pods/4de101ce-a0bf-4231-a572-a26b48e55f24/volumes" Dec 04 10:03:19 crc kubenswrapper[4776]: I1204 10:03:19.467622 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97af0008-c8e6-4448-99c1-1a465bd92ac9" path="/var/lib/kubelet/pods/97af0008-c8e6-4448-99c1-1a465bd92ac9/volumes" Dec 04 10:03:19 crc kubenswrapper[4776]: I1204 10:03:19.521062 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 10:03:19 crc kubenswrapper[4776]: W1204 10:03:19.521101 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3abc5a5_f26f_4c50_9780_b79f683b4243.slice/crio-bc1ff49a32a0f0011030151d4761b74c94b8f6b1391ad79ca844301e05b32c7d WatchSource:0}: Error finding container bc1ff49a32a0f0011030151d4761b74c94b8f6b1391ad79ca844301e05b32c7d: Status 404 returned error can't find the container with id 
bc1ff49a32a0f0011030151d4761b74c94b8f6b1391ad79ca844301e05b32c7d Dec 04 10:03:19 crc kubenswrapper[4776]: I1204 10:03:19.610754 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a3abc5a5-f26f-4c50-9780-b79f683b4243","Type":"ContainerStarted","Data":"bc1ff49a32a0f0011030151d4761b74c94b8f6b1391ad79ca844301e05b32c7d"} Dec 04 10:03:19 crc kubenswrapper[4776]: I1204 10:03:19.614601 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"840a5d28-ff84-411a-837a-5976118c262d","Type":"ContainerStarted","Data":"f50396ff62afc7b3234671fd2997af8fa41de04e04ad498f0d6b80fd49ace879"} Dec 04 10:03:19 crc kubenswrapper[4776]: I1204 10:03:19.617467 4776 generic.go:334] "Generic (PLEG): container finished" podID="e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" containerID="f65c099625b3848f7997262df38cf95a673b7e15704966fc05df10481e3b1b16" exitCode=0 Dec 04 10:03:19 crc kubenswrapper[4776]: I1204 10:03:19.617536 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53","Type":"ContainerDied","Data":"f65c099625b3848f7997262df38cf95a673b7e15704966fc05df10481e3b1b16"} Dec 04 10:03:19 crc kubenswrapper[4776]: I1204 10:03:19.619413 4776 generic.go:334] "Generic (PLEG): container finished" podID="c01f0a3b-848f-4278-8ed4-4ef03122aeb2" containerID="7fe1de89d349557573a3645b6f0d2081741861857883df6d894ae15ad852b15f" exitCode=0 Dec 04 10:03:19 crc kubenswrapper[4776]: I1204 10:03:19.619452 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g57lm" event={"ID":"c01f0a3b-848f-4278-8ed4-4ef03122aeb2","Type":"ContainerDied","Data":"7fe1de89d349557573a3645b6f0d2081741861857883df6d894ae15ad852b15f"} Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.469821 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.504907 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-public-tls-certs\") pod \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.504975 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-combined-ca-bundle\") pod \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.505118 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-config-data\") pod \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.505437 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-internal-tls-certs\") pod \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.505527 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krpp2\" (UniqueName: \"kubernetes.io/projected/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-kube-api-access-krpp2\") pod \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.505550 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-logs\") pod \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\" (UID: \"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53\") " Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.506302 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-logs" (OuterVolumeSpecName: "logs") pod "e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" (UID: "e1ff72ef-2c73-46b7-afcf-7f04bbba5a53"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.510279 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-kube-api-access-krpp2" (OuterVolumeSpecName: "kube-api-access-krpp2") pod "e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" (UID: "e1ff72ef-2c73-46b7-afcf-7f04bbba5a53"). InnerVolumeSpecName "kube-api-access-krpp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.531898 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-config-data" (OuterVolumeSpecName: "config-data") pod "e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" (UID: "e1ff72ef-2c73-46b7-afcf-7f04bbba5a53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.535047 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" (UID: "e1ff72ef-2c73-46b7-afcf-7f04bbba5a53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.559393 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" (UID: "e1ff72ef-2c73-46b7-afcf-7f04bbba5a53"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.569002 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" (UID: "e1ff72ef-2c73-46b7-afcf-7f04bbba5a53"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.607653 4776 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.607718 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.607733 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.607762 4776 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-internal-tls-certs\") on node \"crc\" 
DevicePath \"\"" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.607777 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krpp2\" (UniqueName: \"kubernetes.io/projected/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-kube-api-access-krpp2\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.607791 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.630224 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"840a5d28-ff84-411a-837a-5976118c262d","Type":"ContainerStarted","Data":"f0d0810bada27e716e25d04c2fd79fcd3b466bab40a38ff4a3cdf42e317a9cca"} Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.634202 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a3abc5a5-f26f-4c50-9780-b79f683b4243","Type":"ContainerStarted","Data":"707d222277c26fc6bd2108495cc76ffd1fe530097c37e919a2503183a2db9599"} Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.637527 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g57lm" event={"ID":"c01f0a3b-848f-4278-8ed4-4ef03122aeb2","Type":"ContainerStarted","Data":"d73754ac47e2f9189732e265b6eba4d2fd71843c4d570c1f1ca9f86cb7f98f42"} Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.639851 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e1ff72ef-2c73-46b7-afcf-7f04bbba5a53","Type":"ContainerDied","Data":"880dac1934e1083357c1d6a860b47b9d2f432944975232f9f9889f76de06c62b"} Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.639887 4776 scope.go:117] "RemoveContainer" containerID="f65c099625b3848f7997262df38cf95a673b7e15704966fc05df10481e3b1b16" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 
10:03:20.640062 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.656188 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.656164736 podStartE2EDuration="3.656164736s" podCreationTimestamp="2025-12-04 10:03:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:20.64862499 +0000 UTC m=+1445.515105377" watchObservedRunningTime="2025-12-04 10:03:20.656164736 +0000 UTC m=+1445.522645113" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.677376 4776 scope.go:117] "RemoveContainer" containerID="152a683da8c47f0b2283a60e04e4ee69cc2f14d316db0e453c31942038fb3c6a" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.688537 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g57lm" podStartSLOduration=3.097965057 podStartE2EDuration="7.688517548s" podCreationTimestamp="2025-12-04 10:03:13 +0000 UTC" firstStartedPulling="2025-12-04 10:03:15.523210111 +0000 UTC m=+1440.389690488" lastFinishedPulling="2025-12-04 10:03:20.113762602 +0000 UTC m=+1444.980242979" observedRunningTime="2025-12-04 10:03:20.673144688 +0000 UTC m=+1445.539625065" watchObservedRunningTime="2025-12-04 10:03:20.688517548 +0000 UTC m=+1445.554997925" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.708399 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.7083829 podStartE2EDuration="2.7083829s" podCreationTimestamp="2025-12-04 10:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:20.701790945 +0000 UTC m=+1445.568271352" watchObservedRunningTime="2025-12-04 
10:03:20.7083829 +0000 UTC m=+1445.574863277" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.724669 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.748287 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.764526 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 10:03:20 crc kubenswrapper[4776]: E1204 10:03:20.765016 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" containerName="nova-api-log" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.765036 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" containerName="nova-api-log" Dec 04 10:03:20 crc kubenswrapper[4776]: E1204 10:03:20.765065 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" containerName="nova-api-api" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.765071 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" containerName="nova-api-api" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.765301 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" containerName="nova-api-log" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.765328 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" containerName="nova-api-api" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.766488 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.773964 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.774187 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.774874 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.776842 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.811019 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/519810d5-1e42-413c-893d-81e992b49d5b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.811114 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/519810d5-1e42-413c-893d-81e992b49d5b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.811186 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxjmv\" (UniqueName: \"kubernetes.io/projected/519810d5-1e42-413c-893d-81e992b49d5b-kube-api-access-dxjmv\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.811222 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/519810d5-1e42-413c-893d-81e992b49d5b-config-data\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.811257 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/519810d5-1e42-413c-893d-81e992b49d5b-logs\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.811560 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/519810d5-1e42-413c-893d-81e992b49d5b-public-tls-certs\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.913622 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/519810d5-1e42-413c-893d-81e992b49d5b-public-tls-certs\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.913705 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/519810d5-1e42-413c-893d-81e992b49d5b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.913768 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/519810d5-1e42-413c-893d-81e992b49d5b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 
04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.913813 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxjmv\" (UniqueName: \"kubernetes.io/projected/519810d5-1e42-413c-893d-81e992b49d5b-kube-api-access-dxjmv\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.913854 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/519810d5-1e42-413c-893d-81e992b49d5b-config-data\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.913887 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/519810d5-1e42-413c-893d-81e992b49d5b-logs\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.914326 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/519810d5-1e42-413c-893d-81e992b49d5b-logs\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.917370 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/519810d5-1e42-413c-893d-81e992b49d5b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.917939 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/519810d5-1e42-413c-893d-81e992b49d5b-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.917982 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/519810d5-1e42-413c-893d-81e992b49d5b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.918303 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/519810d5-1e42-413c-893d-81e992b49d5b-config-data\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:20 crc kubenswrapper[4776]: I1204 10:03:20.934577 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxjmv\" (UniqueName: \"kubernetes.io/projected/519810d5-1e42-413c-893d-81e992b49d5b-kube-api-access-dxjmv\") pod \"nova-api-0\" (UID: \"519810d5-1e42-413c-893d-81e992b49d5b\") " pod="openstack/nova-api-0" Dec 04 10:03:21 crc kubenswrapper[4776]: I1204 10:03:21.083897 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 10:03:21 crc kubenswrapper[4776]: I1204 10:03:21.464754 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ff72ef-2c73-46b7-afcf-7f04bbba5a53" path="/var/lib/kubelet/pods/e1ff72ef-2c73-46b7-afcf-7f04bbba5a53/volumes" Dec 04 10:03:21 crc kubenswrapper[4776]: I1204 10:03:21.553657 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 10:03:21 crc kubenswrapper[4776]: I1204 10:03:21.655646 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"519810d5-1e42-413c-893d-81e992b49d5b","Type":"ContainerStarted","Data":"8ae1dc5349ab79ca3f0944cf9e5db66f941bbd4d9fc60396d145dce4b4cdd9d6"} Dec 04 10:03:22 crc kubenswrapper[4776]: I1204 10:03:22.688696 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"519810d5-1e42-413c-893d-81e992b49d5b","Type":"ContainerStarted","Data":"932c8f1043a459c1fcfbaf1a062a88f0829dc0f0c6608a1779ed871324a3098f"} Dec 04 10:03:22 crc kubenswrapper[4776]: I1204 10:03:22.689059 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"519810d5-1e42-413c-893d-81e992b49d5b","Type":"ContainerStarted","Data":"ff622ac9faadcdc4a3c5c0941676175a737317313e48d4fa67b1264f0fd119fb"} Dec 04 10:03:22 crc kubenswrapper[4776]: I1204 10:03:22.723260 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.72319885 podStartE2EDuration="2.72319885s" podCreationTimestamp="2025-12-04 10:03:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:03:22.71713813 +0000 UTC m=+1447.583618527" watchObservedRunningTime="2025-12-04 10:03:22.72319885 +0000 UTC m=+1447.589679227" Dec 04 10:03:22 crc kubenswrapper[4776]: I1204 10:03:22.951892 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:03:22 crc kubenswrapper[4776]: I1204 10:03:22.951976 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 10:03:24 crc kubenswrapper[4776]: I1204 10:03:24.078017 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 10:03:24 crc kubenswrapper[4776]: I1204 10:03:24.258297 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:24 crc kubenswrapper[4776]: I1204 10:03:24.258367 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:25 crc kubenswrapper[4776]: I1204 10:03:25.305221 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g57lm" podUID="c01f0a3b-848f-4278-8ed4-4ef03122aeb2" containerName="registry-server" probeResult="failure" output=< Dec 04 10:03:25 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 04 10:03:25 crc kubenswrapper[4776]: > Dec 04 10:03:27 crc kubenswrapper[4776]: I1204 10:03:27.951668 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 10:03:27 crc kubenswrapper[4776]: I1204 10:03:27.952041 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 10:03:28 crc kubenswrapper[4776]: I1204 10:03:28.969187 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="840a5d28-ff84-411a-837a-5976118c262d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:03:28 crc kubenswrapper[4776]: I1204 10:03:28.969191 4776 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-metadata-0" podUID="840a5d28-ff84-411a-837a-5976118c262d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:03:29 crc kubenswrapper[4776]: I1204 10:03:29.073310 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 10:03:29 crc kubenswrapper[4776]: I1204 10:03:29.103481 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 10:03:29 crc kubenswrapper[4776]: I1204 10:03:29.782009 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 10:03:31 crc kubenswrapper[4776]: I1204 10:03:31.084392 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:03:31 crc kubenswrapper[4776]: I1204 10:03:31.084468 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 10:03:31 crc kubenswrapper[4776]: I1204 10:03:31.716543 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 10:03:32 crc kubenswrapper[4776]: I1204 10:03:32.104152 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="519810d5-1e42-413c-893d-81e992b49d5b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 10:03:32 crc kubenswrapper[4776]: I1204 10:03:32.104459 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="519810d5-1e42-413c-893d-81e992b49d5b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Dec 04 10:03:34 crc kubenswrapper[4776]: I1204 10:03:34.314181 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:34 crc kubenswrapper[4776]: I1204 10:03:34.386811 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:34 crc kubenswrapper[4776]: I1204 10:03:34.562981 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g57lm"] Dec 04 10:03:35 crc kubenswrapper[4776]: I1204 10:03:35.807234 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g57lm" podUID="c01f0a3b-848f-4278-8ed4-4ef03122aeb2" containerName="registry-server" containerID="cri-o://d73754ac47e2f9189732e265b6eba4d2fd71843c4d570c1f1ca9f86cb7f98f42" gracePeriod=2 Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.284776 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.332540 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmjgm\" (UniqueName: \"kubernetes.io/projected/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-kube-api-access-cmjgm\") pod \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\" (UID: \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\") " Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.332611 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-utilities\") pod \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\" (UID: \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\") " Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.332699 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-catalog-content\") pod \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\" (UID: \"c01f0a3b-848f-4278-8ed4-4ef03122aeb2\") " Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.333359 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-utilities" (OuterVolumeSpecName: "utilities") pod "c01f0a3b-848f-4278-8ed4-4ef03122aeb2" (UID: "c01f0a3b-848f-4278-8ed4-4ef03122aeb2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.335378 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.339957 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-kube-api-access-cmjgm" (OuterVolumeSpecName: "kube-api-access-cmjgm") pod "c01f0a3b-848f-4278-8ed4-4ef03122aeb2" (UID: "c01f0a3b-848f-4278-8ed4-4ef03122aeb2"). InnerVolumeSpecName "kube-api-access-cmjgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.436870 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmjgm\" (UniqueName: \"kubernetes.io/projected/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-kube-api-access-cmjgm\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.459133 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c01f0a3b-848f-4278-8ed4-4ef03122aeb2" (UID: "c01f0a3b-848f-4278-8ed4-4ef03122aeb2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.540025 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c01f0a3b-848f-4278-8ed4-4ef03122aeb2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.822702 4776 generic.go:334] "Generic (PLEG): container finished" podID="c01f0a3b-848f-4278-8ed4-4ef03122aeb2" containerID="d73754ac47e2f9189732e265b6eba4d2fd71843c4d570c1f1ca9f86cb7f98f42" exitCode=0 Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.822757 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g57lm" event={"ID":"c01f0a3b-848f-4278-8ed4-4ef03122aeb2","Type":"ContainerDied","Data":"d73754ac47e2f9189732e265b6eba4d2fd71843c4d570c1f1ca9f86cb7f98f42"} Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.822784 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g57lm" event={"ID":"c01f0a3b-848f-4278-8ed4-4ef03122aeb2","Type":"ContainerDied","Data":"1abc42164451414f815844e7b288595c4b9119da042693514d0efd5ad7eba9bc"} Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.822792 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g57lm" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.822801 4776 scope.go:117] "RemoveContainer" containerID="d73754ac47e2f9189732e265b6eba4d2fd71843c4d570c1f1ca9f86cb7f98f42" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.846994 4776 scope.go:117] "RemoveContainer" containerID="7fe1de89d349557573a3645b6f0d2081741861857883df6d894ae15ad852b15f" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.862033 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g57lm"] Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.869663 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g57lm"] Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.877277 4776 scope.go:117] "RemoveContainer" containerID="fe659c268137a00429a63582a49d4675c98877e1fde361f705e0b48a9879eb81" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.907010 4776 scope.go:117] "RemoveContainer" containerID="d73754ac47e2f9189732e265b6eba4d2fd71843c4d570c1f1ca9f86cb7f98f42" Dec 04 10:03:36 crc kubenswrapper[4776]: E1204 10:03:36.907538 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d73754ac47e2f9189732e265b6eba4d2fd71843c4d570c1f1ca9f86cb7f98f42\": container with ID starting with d73754ac47e2f9189732e265b6eba4d2fd71843c4d570c1f1ca9f86cb7f98f42 not found: ID does not exist" containerID="d73754ac47e2f9189732e265b6eba4d2fd71843c4d570c1f1ca9f86cb7f98f42" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.907569 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d73754ac47e2f9189732e265b6eba4d2fd71843c4d570c1f1ca9f86cb7f98f42"} err="failed to get container status \"d73754ac47e2f9189732e265b6eba4d2fd71843c4d570c1f1ca9f86cb7f98f42\": rpc error: code = NotFound desc = could not find container 
\"d73754ac47e2f9189732e265b6eba4d2fd71843c4d570c1f1ca9f86cb7f98f42\": container with ID starting with d73754ac47e2f9189732e265b6eba4d2fd71843c4d570c1f1ca9f86cb7f98f42 not found: ID does not exist" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.907591 4776 scope.go:117] "RemoveContainer" containerID="7fe1de89d349557573a3645b6f0d2081741861857883df6d894ae15ad852b15f" Dec 04 10:03:36 crc kubenswrapper[4776]: E1204 10:03:36.907884 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fe1de89d349557573a3645b6f0d2081741861857883df6d894ae15ad852b15f\": container with ID starting with 7fe1de89d349557573a3645b6f0d2081741861857883df6d894ae15ad852b15f not found: ID does not exist" containerID="7fe1de89d349557573a3645b6f0d2081741861857883df6d894ae15ad852b15f" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.907908 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fe1de89d349557573a3645b6f0d2081741861857883df6d894ae15ad852b15f"} err="failed to get container status \"7fe1de89d349557573a3645b6f0d2081741861857883df6d894ae15ad852b15f\": rpc error: code = NotFound desc = could not find container \"7fe1de89d349557573a3645b6f0d2081741861857883df6d894ae15ad852b15f\": container with ID starting with 7fe1de89d349557573a3645b6f0d2081741861857883df6d894ae15ad852b15f not found: ID does not exist" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.907940 4776 scope.go:117] "RemoveContainer" containerID="fe659c268137a00429a63582a49d4675c98877e1fde361f705e0b48a9879eb81" Dec 04 10:03:36 crc kubenswrapper[4776]: E1204 10:03:36.908294 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe659c268137a00429a63582a49d4675c98877e1fde361f705e0b48a9879eb81\": container with ID starting with fe659c268137a00429a63582a49d4675c98877e1fde361f705e0b48a9879eb81 not found: ID does not exist" 
containerID="fe659c268137a00429a63582a49d4675c98877e1fde361f705e0b48a9879eb81" Dec 04 10:03:36 crc kubenswrapper[4776]: I1204 10:03:36.908335 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe659c268137a00429a63582a49d4675c98877e1fde361f705e0b48a9879eb81"} err="failed to get container status \"fe659c268137a00429a63582a49d4675c98877e1fde361f705e0b48a9879eb81\": rpc error: code = NotFound desc = could not find container \"fe659c268137a00429a63582a49d4675c98877e1fde361f705e0b48a9879eb81\": container with ID starting with fe659c268137a00429a63582a49d4675c98877e1fde361f705e0b48a9879eb81 not found: ID does not exist" Dec 04 10:03:37 crc kubenswrapper[4776]: I1204 10:03:37.464010 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c01f0a3b-848f-4278-8ed4-4ef03122aeb2" path="/var/lib/kubelet/pods/c01f0a3b-848f-4278-8ed4-4ef03122aeb2/volumes" Dec 04 10:03:37 crc kubenswrapper[4776]: I1204 10:03:37.957233 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 10:03:37 crc kubenswrapper[4776]: I1204 10:03:37.957460 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 10:03:37 crc kubenswrapper[4776]: I1204 10:03:37.963534 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 10:03:37 crc kubenswrapper[4776]: I1204 10:03:37.964297 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 10:03:41 crc kubenswrapper[4776]: I1204 10:03:41.092243 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 10:03:41 crc kubenswrapper[4776]: I1204 10:03:41.092833 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 10:03:41 crc kubenswrapper[4776]: I1204 10:03:41.093155 4776 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 10:03:41 crc kubenswrapper[4776]: I1204 10:03:41.093196 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 10:03:41 crc kubenswrapper[4776]: I1204 10:03:41.101248 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 10:03:41 crc kubenswrapper[4776]: I1204 10:03:41.102609 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 10:03:50 crc kubenswrapper[4776]: I1204 10:03:50.420508 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:03:51 crc kubenswrapper[4776]: I1204 10:03:51.242211 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 10:03:55 crc kubenswrapper[4776]: I1204 10:03:55.053508 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" containerName="rabbitmq" containerID="cri-o://7da670dd0ffc0608183b33a160fbcca393ab32a10f514259c060c08d43b6ff6d" gracePeriod=604796 Dec 04 10:03:55 crc kubenswrapper[4776]: I1204 10:03:55.737654 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" containerName="rabbitmq" containerID="cri-o://ac93e3b00fe9aa384a31c981b7cd010db59a5e4a593c0ae1cfe67d3f462327ca" gracePeriod=604796 Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.023839 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.403401 4776 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.640710 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.738263 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65d6s\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-kube-api-access-65d6s\") pod \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.738351 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-config-data\") pod \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.738398 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-pod-info\") pod \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.738484 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-confd\") pod \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.738561 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-erlang-cookie-secret\") pod \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.738607 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-erlang-cookie\") pod \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.738666 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-tls\") pod \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.738691 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-server-conf\") pod \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.738727 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-plugins-conf\") pod \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.738755 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-plugins\") pod \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " Dec 04 10:04:01 crc 
kubenswrapper[4776]: I1204 10:04:01.738828 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\" (UID: \"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e\") " Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.739599 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" (UID: "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.739779 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" (UID: "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.739843 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" (UID: "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.746304 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" (UID: "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.746468 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-pod-info" (OuterVolumeSpecName: "pod-info") pod "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" (UID: "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.752228 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" (UID: "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.752949 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-kube-api-access-65d6s" (OuterVolumeSpecName: "kube-api-access-65d6s") pod "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" (UID: "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e"). InnerVolumeSpecName "kube-api-access-65d6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.788463 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" (UID: "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.804113 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-config-data" (OuterVolumeSpecName: "config-data") pod "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" (UID: "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.832862 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-server-conf" (OuterVolumeSpecName: "server-conf") pod "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" (UID: "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.841099 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.841141 4776 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.841157 4776 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.841169 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.841198 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.841213 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65d6s\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-kube-api-access-65d6s\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.841227 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.841240 4776 
reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.841252 4776 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.841264 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.896468 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.923749 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" (UID: "1b1b8bd1-3c18-4127-bb66-a3f99b106b8e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.943424 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:01 crc kubenswrapper[4776]: I1204 10:04:01.943466 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.067356 4776 generic.go:334] "Generic (PLEG): container finished" podID="1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" containerID="7da670dd0ffc0608183b33a160fbcca393ab32a10f514259c060c08d43b6ff6d" exitCode=0 Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.067406 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e","Type":"ContainerDied","Data":"7da670dd0ffc0608183b33a160fbcca393ab32a10f514259c060c08d43b6ff6d"} Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.067435 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1b1b8bd1-3c18-4127-bb66-a3f99b106b8e","Type":"ContainerDied","Data":"fe98d8f101b49fc76afc5392f96b7b9cfc8995764ca7b4d72002dad3d6a9117f"} Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.067453 4776 scope.go:117] "RemoveContainer" containerID="7da670dd0ffc0608183b33a160fbcca393ab32a10f514259c060c08d43b6ff6d" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.067606 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.159966 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.164163 4776 scope.go:117] "RemoveContainer" containerID="b3555c8dd05aa4b2bc15c3ab727e5bc2259436e7c875db3cd332992042699a3c" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.173826 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.198224 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:04:02 crc kubenswrapper[4776]: E1204 10:04:02.198789 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01f0a3b-848f-4278-8ed4-4ef03122aeb2" containerName="registry-server" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.198832 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01f0a3b-848f-4278-8ed4-4ef03122aeb2" containerName="registry-server" Dec 04 10:04:02 crc kubenswrapper[4776]: E1204 10:04:02.198856 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01f0a3b-848f-4278-8ed4-4ef03122aeb2" containerName="extract-utilities" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.198864 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01f0a3b-848f-4278-8ed4-4ef03122aeb2" containerName="extract-utilities" Dec 04 10:04:02 crc kubenswrapper[4776]: E1204 10:04:02.198904 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" containerName="rabbitmq" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.198925 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" containerName="rabbitmq" Dec 04 10:04:02 crc kubenswrapper[4776]: E1204 10:04:02.198963 4776 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c01f0a3b-848f-4278-8ed4-4ef03122aeb2" containerName="extract-content" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.198995 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01f0a3b-848f-4278-8ed4-4ef03122aeb2" containerName="extract-content" Dec 04 10:04:02 crc kubenswrapper[4776]: E1204 10:04:02.199070 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" containerName="setup-container" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.199082 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" containerName="setup-container" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.199416 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" containerName="rabbitmq" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.199438 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c01f0a3b-848f-4278-8ed4-4ef03122aeb2" containerName="registry-server" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.201054 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.205088 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.206570 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.206816 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.206959 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jxvm9" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.207134 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.208513 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.211999 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.242501 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.248076 4776 scope.go:117] "RemoveContainer" containerID="7da670dd0ffc0608183b33a160fbcca393ab32a10f514259c060c08d43b6ff6d" Dec 04 10:04:02 crc kubenswrapper[4776]: E1204 10:04:02.252030 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7da670dd0ffc0608183b33a160fbcca393ab32a10f514259c060c08d43b6ff6d\": container with ID starting with 7da670dd0ffc0608183b33a160fbcca393ab32a10f514259c060c08d43b6ff6d not found: ID does not exist" 
containerID="7da670dd0ffc0608183b33a160fbcca393ab32a10f514259c060c08d43b6ff6d" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.252068 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7da670dd0ffc0608183b33a160fbcca393ab32a10f514259c060c08d43b6ff6d"} err="failed to get container status \"7da670dd0ffc0608183b33a160fbcca393ab32a10f514259c060c08d43b6ff6d\": rpc error: code = NotFound desc = could not find container \"7da670dd0ffc0608183b33a160fbcca393ab32a10f514259c060c08d43b6ff6d\": container with ID starting with 7da670dd0ffc0608183b33a160fbcca393ab32a10f514259c060c08d43b6ff6d not found: ID does not exist" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.252092 4776 scope.go:117] "RemoveContainer" containerID="b3555c8dd05aa4b2bc15c3ab727e5bc2259436e7c875db3cd332992042699a3c" Dec 04 10:04:02 crc kubenswrapper[4776]: E1204 10:04:02.253236 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3555c8dd05aa4b2bc15c3ab727e5bc2259436e7c875db3cd332992042699a3c\": container with ID starting with b3555c8dd05aa4b2bc15c3ab727e5bc2259436e7c875db3cd332992042699a3c not found: ID does not exist" containerID="b3555c8dd05aa4b2bc15c3ab727e5bc2259436e7c875db3cd332992042699a3c" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.253290 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3555c8dd05aa4b2bc15c3ab727e5bc2259436e7c875db3cd332992042699a3c"} err="failed to get container status \"b3555c8dd05aa4b2bc15c3ab727e5bc2259436e7c875db3cd332992042699a3c\": rpc error: code = NotFound desc = could not find container \"b3555c8dd05aa4b2bc15c3ab727e5bc2259436e7c875db3cd332992042699a3c\": container with ID starting with b3555c8dd05aa4b2bc15c3ab727e5bc2259436e7c875db3cd332992042699a3c not found: ID does not exist" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.356263 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.356879 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txf4g\" (UniqueName: \"kubernetes.io/projected/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-kube-api-access-txf4g\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.357036 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.357146 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.357236 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-config-data\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.357355 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.357447 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.357531 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.357715 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.357824 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.357939 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.460186 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.460239 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.460267 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.460370 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.460411 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " 
pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.460444 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.460477 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.460512 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txf4g\" (UniqueName: \"kubernetes.io/projected/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-kube-api-access-txf4g\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.460549 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.460574 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.460605 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-config-data\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.461661 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-config-data\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.462041 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.465161 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.465466 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.466136 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.466642 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.474941 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.481665 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.481969 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.484141 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.488078 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-txf4g\" (UniqueName: \"kubernetes.io/projected/6b295c1b-fc2b-4e58-9175-992ce31b3a3c-kube-api-access-txf4g\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.515904 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6b295c1b-fc2b-4e58-9175-992ce31b3a3c\") " pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.522469 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.607242 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.664141 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-erlang-cookie\") pod \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.664447 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-plugins-conf\") pod \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.664507 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-tls\") pod \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") " 
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.664527 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-config-data\") pod \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") "
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.664551 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-pod-info\") pod \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") "
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.664603 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-plugins\") pod \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") "
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.664636 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") "
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.664728 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-confd\") pod \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") "
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.664769 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-erlang-cookie-secret\") pod \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") "
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.664791 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-server-conf\") pod \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") "
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.664816 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2nln\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-kube-api-access-c2nln\") pod \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\" (UID: \"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c\") "
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.668401 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" (UID: "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.668896 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" (UID: "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.669566 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" (UID: "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.677306 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" (UID: "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.677600 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-kube-api-access-c2nln" (OuterVolumeSpecName: "kube-api-access-c2nln") pod "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" (UID: "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c"). InnerVolumeSpecName "kube-api-access-c2nln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.679735 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" (UID: "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.679846 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" (UID: "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.684361 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-pod-info" (OuterVolumeSpecName: "pod-info") pod "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" (UID: "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.721744 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-config-data" (OuterVolumeSpecName: "config-data") pod "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" (UID: "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.744202 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-server-conf" (OuterVolumeSpecName: "server-conf") pod "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" (UID: "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.767456 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.767488 4776 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-plugins-conf\") on node \"crc\" DevicePath \"\""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.767499 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.767506 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.767515 4776 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-pod-info\") on node \"crc\" DevicePath \"\""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.767523 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.767553 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.767565 4776 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.767573 4776 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-server-conf\") on node \"crc\" DevicePath \"\""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.767582 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2nln\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-kube-api-access-c2nln\") on node \"crc\" DevicePath \"\""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.792152 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.827049 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" (UID: "0ba805ac-f1c5-4049-bb56-7dfff0ccc76c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.869792 4776 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Dec 04 10:04:02 crc kubenswrapper[4776]: I1204 10:04:02.869828 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.065838 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 04 10:04:03 crc kubenswrapper[4776]: W1204 10:04:03.068286 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b295c1b_fc2b_4e58_9175_992ce31b3a3c.slice/crio-51573cfc281c7b65f93652e8b67062481368d415ef2e81493977267a28cc7516 WatchSource:0}: Error finding container 51573cfc281c7b65f93652e8b67062481368d415ef2e81493977267a28cc7516: Status 404 returned error can't find the container with id 51573cfc281c7b65f93652e8b67062481368d415ef2e81493977267a28cc7516
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.080329 4776 generic.go:334] "Generic (PLEG): container finished" podID="0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" containerID="ac93e3b00fe9aa384a31c981b7cd010db59a5e4a593c0ae1cfe67d3f462327ca" exitCode=0
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.080391 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c","Type":"ContainerDied","Data":"ac93e3b00fe9aa384a31c981b7cd010db59a5e4a593c0ae1cfe67d3f462327ca"}
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.080449 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.080475 4776 scope.go:117] "RemoveContainer" containerID="ac93e3b00fe9aa384a31c981b7cd010db59a5e4a593c0ae1cfe67d3f462327ca"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.080456 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ba805ac-f1c5-4049-bb56-7dfff0ccc76c","Type":"ContainerDied","Data":"d2a6fd28d05f34761d54c8f7c42b167f3b2a5714d40961f574d3c690bc14b951"}
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.125991 4776 scope.go:117] "RemoveContainer" containerID="e6997a5028e360b37cee78c19260cfa8ab0bf6e191c0066aff520cf03484ca6a"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.127874 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.145098 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.154954 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 04 10:04:03 crc kubenswrapper[4776]: E1204 10:04:03.155431 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" containerName="rabbitmq"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.155453 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" containerName="rabbitmq"
Dec 04 10:04:03 crc kubenswrapper[4776]: E1204 10:04:03.155475 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" containerName="setup-container"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.155483 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" containerName="setup-container"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.155720 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" containerName="rabbitmq"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.157075 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.164087 4776 scope.go:117] "RemoveContainer" containerID="ac93e3b00fe9aa384a31c981b7cd010db59a5e4a593c0ae1cfe67d3f462327ca"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.165379 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.165662 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.165872 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.166028 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ng7tx"
Dec 04 10:04:03 crc kubenswrapper[4776]: E1204 10:04:03.166272 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac93e3b00fe9aa384a31c981b7cd010db59a5e4a593c0ae1cfe67d3f462327ca\": container with ID starting with ac93e3b00fe9aa384a31c981b7cd010db59a5e4a593c0ae1cfe67d3f462327ca not found: ID does not exist" containerID="ac93e3b00fe9aa384a31c981b7cd010db59a5e4a593c0ae1cfe67d3f462327ca"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.166313 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac93e3b00fe9aa384a31c981b7cd010db59a5e4a593c0ae1cfe67d3f462327ca"} err="failed to get container status \"ac93e3b00fe9aa384a31c981b7cd010db59a5e4a593c0ae1cfe67d3f462327ca\": rpc error: code = NotFound desc = could not find container \"ac93e3b00fe9aa384a31c981b7cd010db59a5e4a593c0ae1cfe67d3f462327ca\": container with ID starting with ac93e3b00fe9aa384a31c981b7cd010db59a5e4a593c0ae1cfe67d3f462327ca not found: ID does not exist"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.166344 4776 scope.go:117] "RemoveContainer" containerID="e6997a5028e360b37cee78c19260cfa8ab0bf6e191c0066aff520cf03484ca6a"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.166519 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.166680 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.166854 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 04 10:04:03 crc kubenswrapper[4776]: E1204 10:04:03.168306 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6997a5028e360b37cee78c19260cfa8ab0bf6e191c0066aff520cf03484ca6a\": container with ID starting with e6997a5028e360b37cee78c19260cfa8ab0bf6e191c0066aff520cf03484ca6a not found: ID does not exist" containerID="e6997a5028e360b37cee78c19260cfa8ab0bf6e191c0066aff520cf03484ca6a"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.168345 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6997a5028e360b37cee78c19260cfa8ab0bf6e191c0066aff520cf03484ca6a"} err="failed to get container status \"e6997a5028e360b37cee78c19260cfa8ab0bf6e191c0066aff520cf03484ca6a\": rpc error: code = NotFound desc = could not find container \"e6997a5028e360b37cee78c19260cfa8ab0bf6e191c0066aff520cf03484ca6a\": container with ID starting with e6997a5028e360b37cee78c19260cfa8ab0bf6e191c0066aff520cf03484ca6a not found: ID does not exist"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.190146 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.279224 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.279579 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54msd\" (UniqueName: \"kubernetes.io/projected/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-kube-api-access-54msd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.279626 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.279666 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.279699 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.279727 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.279762 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.279789 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.279840 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.279882 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.279942 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.381444 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.381505 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.381537 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.381606 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.381631 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54msd\" (UniqueName: \"kubernetes.io/projected/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-kube-api-access-54msd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.381663 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.382137 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.382764 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.383043 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.383417 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.383524 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.383615 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.383701 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.384498 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.384837 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.385338 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.386160 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.387044 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.388037 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.388380 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.390505 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.402620 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54msd\" (UniqueName: \"kubernetes.io/projected/3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f-kube-api-access-54msd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.419816 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.465786 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ba805ac-f1c5-4049-bb56-7dfff0ccc76c" path="/var/lib/kubelet/pods/0ba805ac-f1c5-4049-bb56-7dfff0ccc76c/volumes"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.466998 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b1b8bd1-3c18-4127-bb66-a3f99b106b8e" path="/var/lib/kubelet/pods/1b1b8bd1-3c18-4127-bb66-a3f99b106b8e/volumes"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.489325 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 10:04:03 crc kubenswrapper[4776]: I1204 10:04:03.972599 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 04 10:04:04 crc kubenswrapper[4776]: I1204 10:04:04.091892 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f","Type":"ContainerStarted","Data":"44d9e1bc3a9cb1bbd607e4faa05b5b9dacccbb0aec2e68586a489ee1617149b8"}
Dec 04 10:04:04 crc kubenswrapper[4776]: I1204 10:04:04.093992 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b295c1b-fc2b-4e58-9175-992ce31b3a3c","Type":"ContainerStarted","Data":"51573cfc281c7b65f93652e8b67062481368d415ef2e81493977267a28cc7516"}
Dec 04 10:04:05 crc kubenswrapper[4776]: I1204 10:04:05.106264 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b295c1b-fc2b-4e58-9175-992ce31b3a3c","Type":"ContainerStarted","Data":"5462606cfe6c77f71decd2ec3e6419bffad90b3b3a5cc0c31ebaceb7fbf57313"}
Dec 04 10:04:06 crc kubenswrapper[4776]: I1204 10:04:06.119807 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f","Type":"ContainerStarted","Data":"98b4d5a063b6b3ae10741623277bb0709f43169c4f74ec5402ac1d0f6310bc22"}
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.505867 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-hwd2h"]
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.507776 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.509991 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.516929 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-hwd2h"]
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.655354 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.655751 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.655831 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.656049 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.656098 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-config\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.656140 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s97hr\" (UniqueName: \"kubernetes.io/projected/6b3106c4-3cbc-4add-a64c-62155b44bca4-kube-api-access-s97hr\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.758662 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.758810 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.759796 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.759857 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.760685 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.760798 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-config\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.760903 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s97hr\" (UniqueName: \"kubernetes.io/projected/6b3106c4-3cbc-4add-a64c-62155b44bca4-kube-api-access-s97hr\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.761037 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h"
Dec 04 10:04:07 crc
kubenswrapper[4776]: I1204 10:04:07.761449 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-config\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.762148 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.762212 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.780950 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s97hr\" (UniqueName: \"kubernetes.io/projected/6b3106c4-3cbc-4add-a64c-62155b44bca4-kube-api-access-s97hr\") pod \"dnsmasq-dns-6447ccbd8f-hwd2h\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" Dec 04 10:04:07 crc kubenswrapper[4776]: I1204 10:04:07.826624 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" Dec 04 10:04:08 crc kubenswrapper[4776]: I1204 10:04:08.315118 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-hwd2h"] Dec 04 10:04:09 crc kubenswrapper[4776]: I1204 10:04:09.154592 4776 generic.go:334] "Generic (PLEG): container finished" podID="6b3106c4-3cbc-4add-a64c-62155b44bca4" containerID="51d1a20c4b18c2863dff944806727b43b987ed8addcd7b94f53b90b5c554a77d" exitCode=0 Dec 04 10:04:09 crc kubenswrapper[4776]: I1204 10:04:09.154713 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" event={"ID":"6b3106c4-3cbc-4add-a64c-62155b44bca4","Type":"ContainerDied","Data":"51d1a20c4b18c2863dff944806727b43b987ed8addcd7b94f53b90b5c554a77d"} Dec 04 10:04:09 crc kubenswrapper[4776]: I1204 10:04:09.155036 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" event={"ID":"6b3106c4-3cbc-4add-a64c-62155b44bca4","Type":"ContainerStarted","Data":"465fa9d2595965147cc54e9b4cd6244f0da5cbceafbcf9340348156c27a470c4"} Dec 04 10:04:10 crc kubenswrapper[4776]: I1204 10:04:10.166067 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" event={"ID":"6b3106c4-3cbc-4add-a64c-62155b44bca4","Type":"ContainerStarted","Data":"c5cb7c030689efe152dd2705c3600a4ab0fdfed1a23bd3986c69a013d757e3d7"} Dec 04 10:04:10 crc kubenswrapper[4776]: I1204 10:04:10.166598 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" Dec 04 10:04:10 crc kubenswrapper[4776]: I1204 10:04:10.190438 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" podStartSLOduration=3.190417422 podStartE2EDuration="3.190417422s" podCreationTimestamp="2025-12-04 10:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:04:10.183142423 +0000 UTC m=+1495.049622810" watchObservedRunningTime="2025-12-04 10:04:10.190417422 +0000 UTC m=+1495.056897799" Dec 04 10:04:17 crc kubenswrapper[4776]: I1204 10:04:17.828071 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" Dec 04 10:04:17 crc kubenswrapper[4776]: I1204 10:04:17.887827 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-8n47z"] Dec 04 10:04:17 crc kubenswrapper[4776]: I1204 10:04:17.888112 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-8n47z" podUID="1a99db84-a067-4f3b-ae24-7f59633187d1" containerName="dnsmasq-dns" containerID="cri-o://9c64e5e0c59887dc510d9dbe5d08cd9211d4d597f7ee07a7985ad35268186447" gracePeriod=10 Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.063222 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-7tsd6"] Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.065218 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.076249 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-7tsd6"] Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.227456 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.227582 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.227613 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-config\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.227674 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.227707 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.227731 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wm4b\" (UniqueName: \"kubernetes.io/projected/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-kube-api-access-7wm4b\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.241614 4776 generic.go:334] "Generic (PLEG): container finished" podID="1a99db84-a067-4f3b-ae24-7f59633187d1" containerID="9c64e5e0c59887dc510d9dbe5d08cd9211d4d597f7ee07a7985ad35268186447" exitCode=0 Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.241683 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-8n47z" event={"ID":"1a99db84-a067-4f3b-ae24-7f59633187d1","Type":"ContainerDied","Data":"9c64e5e0c59887dc510d9dbe5d08cd9211d4d597f7ee07a7985ad35268186447"} Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.330270 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.330344 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-config\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " 
pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.330429 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.330473 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.330496 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wm4b\" (UniqueName: \"kubernetes.io/projected/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-kube-api-access-7wm4b\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.330531 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.331708 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc 
kubenswrapper[4776]: I1204 10:04:18.333022 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.333556 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-config\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.334179 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.334661 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.363330 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wm4b\" (UniqueName: \"kubernetes.io/projected/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-kube-api-access-7wm4b\") pod \"dnsmasq-dns-864d5fc68c-7tsd6\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.411752 4776 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.498994 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.635547 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-ovsdbserver-sb\") pod \"1a99db84-a067-4f3b-ae24-7f59633187d1\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.635708 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-dns-svc\") pod \"1a99db84-a067-4f3b-ae24-7f59633187d1\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.635868 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-config\") pod \"1a99db84-a067-4f3b-ae24-7f59633187d1\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.636110 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-ovsdbserver-nb\") pod \"1a99db84-a067-4f3b-ae24-7f59633187d1\" (UID: \"1a99db84-a067-4f3b-ae24-7f59633187d1\") " Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.636280 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7qt9\" (UniqueName: \"kubernetes.io/projected/1a99db84-a067-4f3b-ae24-7f59633187d1-kube-api-access-r7qt9\") pod \"1a99db84-a067-4f3b-ae24-7f59633187d1\" (UID: 
\"1a99db84-a067-4f3b-ae24-7f59633187d1\") " Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.659936 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a99db84-a067-4f3b-ae24-7f59633187d1-kube-api-access-r7qt9" (OuterVolumeSpecName: "kube-api-access-r7qt9") pod "1a99db84-a067-4f3b-ae24-7f59633187d1" (UID: "1a99db84-a067-4f3b-ae24-7f59633187d1"). InnerVolumeSpecName "kube-api-access-r7qt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.692886 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-config" (OuterVolumeSpecName: "config") pod "1a99db84-a067-4f3b-ae24-7f59633187d1" (UID: "1a99db84-a067-4f3b-ae24-7f59633187d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.699584 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a99db84-a067-4f3b-ae24-7f59633187d1" (UID: "1a99db84-a067-4f3b-ae24-7f59633187d1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.704122 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1a99db84-a067-4f3b-ae24-7f59633187d1" (UID: "1a99db84-a067-4f3b-ae24-7f59633187d1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.704529 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1a99db84-a067-4f3b-ae24-7f59633187d1" (UID: "1a99db84-a067-4f3b-ae24-7f59633187d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.739719 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.739760 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7qt9\" (UniqueName: \"kubernetes.io/projected/1a99db84-a067-4f3b-ae24-7f59633187d1-kube-api-access-r7qt9\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.739773 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.739784 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.739793 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a99db84-a067-4f3b-ae24-7f59633187d1-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:18 crc kubenswrapper[4776]: I1204 10:04:18.894703 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-7tsd6"] Dec 04 
10:04:18 crc kubenswrapper[4776]: W1204 10:04:18.896926 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda52c3d7a_7923_433a_b4ee_3ab8242fe04c.slice/crio-81995df34ed68ed37a586e49f707d6ff5d53685fddf9d3aa79131fe04ab9783b WatchSource:0}: Error finding container 81995df34ed68ed37a586e49f707d6ff5d53685fddf9d3aa79131fe04ab9783b: Status 404 returned error can't find the container with id 81995df34ed68ed37a586e49f707d6ff5d53685fddf9d3aa79131fe04ab9783b Dec 04 10:04:19 crc kubenswrapper[4776]: I1204 10:04:19.252795 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-8n47z" event={"ID":"1a99db84-a067-4f3b-ae24-7f59633187d1","Type":"ContainerDied","Data":"84f6368f1557d33c1748e96e4ac02e02129b80e0b43dd53636433edc60a3e9f8"} Dec 04 10:04:19 crc kubenswrapper[4776]: I1204 10:04:19.252842 4776 scope.go:117] "RemoveContainer" containerID="9c64e5e0c59887dc510d9dbe5d08cd9211d4d597f7ee07a7985ad35268186447" Dec 04 10:04:19 crc kubenswrapper[4776]: I1204 10:04:19.252963 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-8n47z" Dec 04 10:04:19 crc kubenswrapper[4776]: I1204 10:04:19.259706 4776 generic.go:334] "Generic (PLEG): container finished" podID="a52c3d7a-7923-433a-b4ee-3ab8242fe04c" containerID="9c42e8590ea98f2930d14b338a97ac0d12edd3ed165adeb51d8e8e4fbcc05230" exitCode=0 Dec 04 10:04:19 crc kubenswrapper[4776]: I1204 10:04:19.259752 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" event={"ID":"a52c3d7a-7923-433a-b4ee-3ab8242fe04c","Type":"ContainerDied","Data":"9c42e8590ea98f2930d14b338a97ac0d12edd3ed165adeb51d8e8e4fbcc05230"} Dec 04 10:04:19 crc kubenswrapper[4776]: I1204 10:04:19.259784 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" event={"ID":"a52c3d7a-7923-433a-b4ee-3ab8242fe04c","Type":"ContainerStarted","Data":"81995df34ed68ed37a586e49f707d6ff5d53685fddf9d3aa79131fe04ab9783b"} Dec 04 10:04:19 crc kubenswrapper[4776]: I1204 10:04:19.287711 4776 scope.go:117] "RemoveContainer" containerID="f4e8b2c83a32e764c566f3bad35e9c10317ad406edb0eede1de8f67909d877a8" Dec 04 10:04:19 crc kubenswrapper[4776]: I1204 10:04:19.315487 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-8n47z"] Dec 04 10:04:19 crc kubenswrapper[4776]: I1204 10:04:19.328245 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-8n47z"] Dec 04 10:04:19 crc kubenswrapper[4776]: I1204 10:04:19.380344 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:04:19 crc kubenswrapper[4776]: I1204 10:04:19.380409 4776 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:04:19 crc kubenswrapper[4776]: I1204 10:04:19.465591 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a99db84-a067-4f3b-ae24-7f59633187d1" path="/var/lib/kubelet/pods/1a99db84-a067-4f3b-ae24-7f59633187d1/volumes" Dec 04 10:04:20 crc kubenswrapper[4776]: I1204 10:04:20.272971 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" event={"ID":"a52c3d7a-7923-433a-b4ee-3ab8242fe04c","Type":"ContainerStarted","Data":"af12b809cba73a4c828c0679864e84da4147143aef6e15ea3257f59df45f08ff"} Dec 04 10:04:20 crc kubenswrapper[4776]: I1204 10:04:20.292467 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" podStartSLOduration=2.292449043 podStartE2EDuration="2.292449043s" podCreationTimestamp="2025-12-04 10:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:04:20.291117482 +0000 UTC m=+1505.157597869" watchObservedRunningTime="2025-12-04 10:04:20.292449043 +0000 UTC m=+1505.158929420" Dec 04 10:04:21 crc kubenswrapper[4776]: I1204 10:04:21.282006 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:28 crc kubenswrapper[4776]: I1204 10:04:28.414212 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:04:28 crc kubenswrapper[4776]: I1204 10:04:28.493685 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-hwd2h"] Dec 04 10:04:28 crc kubenswrapper[4776]: I1204 10:04:28.494229 4776 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" podUID="6b3106c4-3cbc-4add-a64c-62155b44bca4" containerName="dnsmasq-dns" containerID="cri-o://c5cb7c030689efe152dd2705c3600a4ab0fdfed1a23bd3986c69a013d757e3d7" gracePeriod=10 Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.041468 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.189600 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-ovsdbserver-nb\") pod \"6b3106c4-3cbc-4add-a64c-62155b44bca4\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.189704 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s97hr\" (UniqueName: \"kubernetes.io/projected/6b3106c4-3cbc-4add-a64c-62155b44bca4-kube-api-access-s97hr\") pod \"6b3106c4-3cbc-4add-a64c-62155b44bca4\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.189751 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-ovsdbserver-sb\") pod \"6b3106c4-3cbc-4add-a64c-62155b44bca4\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.189858 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-config\") pod \"6b3106c4-3cbc-4add-a64c-62155b44bca4\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.190050 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-openstack-edpm-ipam\") pod \"6b3106c4-3cbc-4add-a64c-62155b44bca4\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.190092 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-dns-svc\") pod \"6b3106c4-3cbc-4add-a64c-62155b44bca4\" (UID: \"6b3106c4-3cbc-4add-a64c-62155b44bca4\") " Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.196371 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b3106c4-3cbc-4add-a64c-62155b44bca4-kube-api-access-s97hr" (OuterVolumeSpecName: "kube-api-access-s97hr") pod "6b3106c4-3cbc-4add-a64c-62155b44bca4" (UID: "6b3106c4-3cbc-4add-a64c-62155b44bca4"). InnerVolumeSpecName "kube-api-access-s97hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.243439 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6b3106c4-3cbc-4add-a64c-62155b44bca4" (UID: "6b3106c4-3cbc-4add-a64c-62155b44bca4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.247202 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-config" (OuterVolumeSpecName: "config") pod "6b3106c4-3cbc-4add-a64c-62155b44bca4" (UID: "6b3106c4-3cbc-4add-a64c-62155b44bca4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.249437 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "6b3106c4-3cbc-4add-a64c-62155b44bca4" (UID: "6b3106c4-3cbc-4add-a64c-62155b44bca4"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.260369 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b3106c4-3cbc-4add-a64c-62155b44bca4" (UID: "6b3106c4-3cbc-4add-a64c-62155b44bca4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.268229 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6b3106c4-3cbc-4add-a64c-62155b44bca4" (UID: "6b3106c4-3cbc-4add-a64c-62155b44bca4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.292101 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.292146 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.292158 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.292168 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s97hr\" (UniqueName: \"kubernetes.io/projected/6b3106c4-3cbc-4add-a64c-62155b44bca4-kube-api-access-s97hr\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.292179 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.292189 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b3106c4-3cbc-4add-a64c-62155b44bca4-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.352102 4776 generic.go:334] "Generic (PLEG): container finished" podID="6b3106c4-3cbc-4add-a64c-62155b44bca4" containerID="c5cb7c030689efe152dd2705c3600a4ab0fdfed1a23bd3986c69a013d757e3d7" exitCode=0 Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.352148 4776 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.352174 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" event={"ID":"6b3106c4-3cbc-4add-a64c-62155b44bca4","Type":"ContainerDied","Data":"c5cb7c030689efe152dd2705c3600a4ab0fdfed1a23bd3986c69a013d757e3d7"} Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.352215 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-hwd2h" event={"ID":"6b3106c4-3cbc-4add-a64c-62155b44bca4","Type":"ContainerDied","Data":"465fa9d2595965147cc54e9b4cd6244f0da5cbceafbcf9340348156c27a470c4"} Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.352251 4776 scope.go:117] "RemoveContainer" containerID="c5cb7c030689efe152dd2705c3600a4ab0fdfed1a23bd3986c69a013d757e3d7" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.375644 4776 scope.go:117] "RemoveContainer" containerID="51d1a20c4b18c2863dff944806727b43b987ed8addcd7b94f53b90b5c554a77d" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.395001 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-hwd2h"] Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.399201 4776 scope.go:117] "RemoveContainer" containerID="c5cb7c030689efe152dd2705c3600a4ab0fdfed1a23bd3986c69a013d757e3d7" Dec 04 10:04:29 crc kubenswrapper[4776]: E1204 10:04:29.399909 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5cb7c030689efe152dd2705c3600a4ab0fdfed1a23bd3986c69a013d757e3d7\": container with ID starting with c5cb7c030689efe152dd2705c3600a4ab0fdfed1a23bd3986c69a013d757e3d7 not found: ID does not exist" containerID="c5cb7c030689efe152dd2705c3600a4ab0fdfed1a23bd3986c69a013d757e3d7" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.400047 4776 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5cb7c030689efe152dd2705c3600a4ab0fdfed1a23bd3986c69a013d757e3d7"} err="failed to get container status \"c5cb7c030689efe152dd2705c3600a4ab0fdfed1a23bd3986c69a013d757e3d7\": rpc error: code = NotFound desc = could not find container \"c5cb7c030689efe152dd2705c3600a4ab0fdfed1a23bd3986c69a013d757e3d7\": container with ID starting with c5cb7c030689efe152dd2705c3600a4ab0fdfed1a23bd3986c69a013d757e3d7 not found: ID does not exist" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.400156 4776 scope.go:117] "RemoveContainer" containerID="51d1a20c4b18c2863dff944806727b43b987ed8addcd7b94f53b90b5c554a77d" Dec 04 10:04:29 crc kubenswrapper[4776]: E1204 10:04:29.400467 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51d1a20c4b18c2863dff944806727b43b987ed8addcd7b94f53b90b5c554a77d\": container with ID starting with 51d1a20c4b18c2863dff944806727b43b987ed8addcd7b94f53b90b5c554a77d not found: ID does not exist" containerID="51d1a20c4b18c2863dff944806727b43b987ed8addcd7b94f53b90b5c554a77d" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.400495 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51d1a20c4b18c2863dff944806727b43b987ed8addcd7b94f53b90b5c554a77d"} err="failed to get container status \"51d1a20c4b18c2863dff944806727b43b987ed8addcd7b94f53b90b5c554a77d\": rpc error: code = NotFound desc = could not find container \"51d1a20c4b18c2863dff944806727b43b987ed8addcd7b94f53b90b5c554a77d\": container with ID starting with 51d1a20c4b18c2863dff944806727b43b987ed8addcd7b94f53b90b5c554a77d not found: ID does not exist" Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.406772 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-hwd2h"] Dec 04 10:04:29 crc kubenswrapper[4776]: I1204 10:04:29.463184 4776 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="6b3106c4-3cbc-4add-a64c-62155b44bca4" path="/var/lib/kubelet/pods/6b3106c4-3cbc-4add-a64c-62155b44bca4/volumes" Dec 04 10:04:37 crc kubenswrapper[4776]: I1204 10:04:37.429827 4776 generic.go:334] "Generic (PLEG): container finished" podID="6b295c1b-fc2b-4e58-9175-992ce31b3a3c" containerID="5462606cfe6c77f71decd2ec3e6419bffad90b3b3a5cc0c31ebaceb7fbf57313" exitCode=0 Dec 04 10:04:37 crc kubenswrapper[4776]: I1204 10:04:37.429939 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b295c1b-fc2b-4e58-9175-992ce31b3a3c","Type":"ContainerDied","Data":"5462606cfe6c77f71decd2ec3e6419bffad90b3b3a5cc0c31ebaceb7fbf57313"} Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.442590 4776 generic.go:334] "Generic (PLEG): container finished" podID="3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f" containerID="98b4d5a063b6b3ae10741623277bb0709f43169c4f74ec5402ac1d0f6310bc22" exitCode=0 Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.442691 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f","Type":"ContainerDied","Data":"98b4d5a063b6b3ae10741623277bb0709f43169c4f74ec5402ac1d0f6310bc22"} Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.447466 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b295c1b-fc2b-4e58-9175-992ce31b3a3c","Type":"ContainerStarted","Data":"15fa6855d8f60ee85b1add68b53ae1a1607e2e91c8f2ef68384cc6433b63af92"} Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.447672 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.521609 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.521592868 podStartE2EDuration="36.521592868s" 
podCreationTimestamp="2025-12-04 10:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:04:38.518626155 +0000 UTC m=+1523.385106542" watchObservedRunningTime="2025-12-04 10:04:38.521592868 +0000 UTC m=+1523.388073245" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.751628 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn"] Dec 04 10:04:38 crc kubenswrapper[4776]: E1204 10:04:38.752120 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3106c4-3cbc-4add-a64c-62155b44bca4" containerName="init" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.752143 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3106c4-3cbc-4add-a64c-62155b44bca4" containerName="init" Dec 04 10:04:38 crc kubenswrapper[4776]: E1204 10:04:38.752168 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a99db84-a067-4f3b-ae24-7f59633187d1" containerName="dnsmasq-dns" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.752176 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a99db84-a067-4f3b-ae24-7f59633187d1" containerName="dnsmasq-dns" Dec 04 10:04:38 crc kubenswrapper[4776]: E1204 10:04:38.752199 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a99db84-a067-4f3b-ae24-7f59633187d1" containerName="init" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.752207 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a99db84-a067-4f3b-ae24-7f59633187d1" containerName="init" Dec 04 10:04:38 crc kubenswrapper[4776]: E1204 10:04:38.752234 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3106c4-3cbc-4add-a64c-62155b44bca4" containerName="dnsmasq-dns" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.752242 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6b3106c4-3cbc-4add-a64c-62155b44bca4" containerName="dnsmasq-dns" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.752461 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3106c4-3cbc-4add-a64c-62155b44bca4" containerName="dnsmasq-dns" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.752482 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a99db84-a067-4f3b-ae24-7f59633187d1" containerName="dnsmasq-dns" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.753276 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.756371 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.756405 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.756667 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.756876 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.773310 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn"] Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.861306 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-494qn\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") 
" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.861360 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-494qn\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.861495 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k444p\" (UniqueName: \"kubernetes.io/projected/d3421cf4-b5ee-45c2-a236-02d6ae85f719-kube-api-access-k444p\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-494qn\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.861546 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-494qn\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.963062 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-494qn\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.963127 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-494qn\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.963227 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k444p\" (UniqueName: \"kubernetes.io/projected/d3421cf4-b5ee-45c2-a236-02d6ae85f719-kube-api-access-k444p\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-494qn\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.963255 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-494qn\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.968578 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-494qn\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.969090 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-494qn\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.975955 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-494qn\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:38 crc kubenswrapper[4776]: I1204 10:04:38.988230 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k444p\" (UniqueName: \"kubernetes.io/projected/d3421cf4-b5ee-45c2-a236-02d6ae85f719-kube-api-access-k444p\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-494qn\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:39 crc kubenswrapper[4776]: I1204 10:04:39.068987 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:04:39 crc kubenswrapper[4776]: I1204 10:04:39.483690 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f","Type":"ContainerStarted","Data":"e9e7e5fcf74ae7bd0d4c4415b6cd8c240f961b0750090c7a6305835adca23bc5"} Dec 04 10:04:39 crc kubenswrapper[4776]: I1204 10:04:39.484762 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:04:39 crc kubenswrapper[4776]: I1204 10:04:39.522547 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.522522998 podStartE2EDuration="36.522522998s" podCreationTimestamp="2025-12-04 10:04:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:04:39.511954147 +0000 UTC m=+1524.378434554" watchObservedRunningTime="2025-12-04 10:04:39.522522998 +0000 UTC m=+1524.389003375" Dec 04 10:04:39 crc kubenswrapper[4776]: I1204 10:04:39.630257 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn"] Dec 04 10:04:40 crc kubenswrapper[4776]: I1204 10:04:40.509486 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" event={"ID":"d3421cf4-b5ee-45c2-a236-02d6ae85f719","Type":"ContainerStarted","Data":"0ecc49eb9455664593754bb042e101824648cab2971fe76193816f40825d5a6f"} Dec 04 10:04:49 crc kubenswrapper[4776]: I1204 10:04:49.380029 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 04 10:04:49 crc kubenswrapper[4776]: I1204 10:04:49.380697 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:04:52 crc kubenswrapper[4776]: I1204 10:04:52.527119 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 10:04:52 crc kubenswrapper[4776]: I1204 10:04:52.640676 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" event={"ID":"d3421cf4-b5ee-45c2-a236-02d6ae85f719","Type":"ContainerStarted","Data":"2d1e1295148b75dc91863a4f5ba4654a148e598567b907ca1769f7bf41e7f28c"} Dec 04 10:04:52 crc kubenswrapper[4776]: I1204 10:04:52.670518 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" podStartSLOduration=2.678586181 podStartE2EDuration="14.670496304s" podCreationTimestamp="2025-12-04 10:04:38 +0000 UTC" firstStartedPulling="2025-12-04 10:04:39.62310113 +0000 UTC m=+1524.489581507" lastFinishedPulling="2025-12-04 10:04:51.615011253 +0000 UTC m=+1536.481491630" observedRunningTime="2025-12-04 10:04:52.664423513 +0000 UTC m=+1537.530903890" watchObservedRunningTime="2025-12-04 10:04:52.670496304 +0000 UTC m=+1537.536976681" Dec 04 10:04:53 crc kubenswrapper[4776]: I1204 10:04:53.492260 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 10:05:01 crc kubenswrapper[4776]: I1204 10:05:01.441250 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q2m4j"] Dec 04 10:05:01 crc kubenswrapper[4776]: I1204 10:05:01.443743 4776 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:01 crc kubenswrapper[4776]: I1204 10:05:01.465124 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2m4j"] Dec 04 10:05:01 crc kubenswrapper[4776]: I1204 10:05:01.568742 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmdst\" (UniqueName: \"kubernetes.io/projected/515b6f83-1bdf-4d83-a87a-29a220dce432-kube-api-access-lmdst\") pod \"community-operators-q2m4j\" (UID: \"515b6f83-1bdf-4d83-a87a-29a220dce432\") " pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:01 crc kubenswrapper[4776]: I1204 10:05:01.568793 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/515b6f83-1bdf-4d83-a87a-29a220dce432-utilities\") pod \"community-operators-q2m4j\" (UID: \"515b6f83-1bdf-4d83-a87a-29a220dce432\") " pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:01 crc kubenswrapper[4776]: I1204 10:05:01.569428 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/515b6f83-1bdf-4d83-a87a-29a220dce432-catalog-content\") pod \"community-operators-q2m4j\" (UID: \"515b6f83-1bdf-4d83-a87a-29a220dce432\") " pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:01 crc kubenswrapper[4776]: I1204 10:05:01.672113 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/515b6f83-1bdf-4d83-a87a-29a220dce432-catalog-content\") pod \"community-operators-q2m4j\" (UID: \"515b6f83-1bdf-4d83-a87a-29a220dce432\") " pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:01 crc kubenswrapper[4776]: I1204 
10:05:01.672214 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmdst\" (UniqueName: \"kubernetes.io/projected/515b6f83-1bdf-4d83-a87a-29a220dce432-kube-api-access-lmdst\") pod \"community-operators-q2m4j\" (UID: \"515b6f83-1bdf-4d83-a87a-29a220dce432\") " pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:01 crc kubenswrapper[4776]: I1204 10:05:01.672239 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/515b6f83-1bdf-4d83-a87a-29a220dce432-utilities\") pod \"community-operators-q2m4j\" (UID: \"515b6f83-1bdf-4d83-a87a-29a220dce432\") " pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:01 crc kubenswrapper[4776]: I1204 10:05:01.672724 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/515b6f83-1bdf-4d83-a87a-29a220dce432-catalog-content\") pod \"community-operators-q2m4j\" (UID: \"515b6f83-1bdf-4d83-a87a-29a220dce432\") " pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:01 crc kubenswrapper[4776]: I1204 10:05:01.672815 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/515b6f83-1bdf-4d83-a87a-29a220dce432-utilities\") pod \"community-operators-q2m4j\" (UID: \"515b6f83-1bdf-4d83-a87a-29a220dce432\") " pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:01 crc kubenswrapper[4776]: I1204 10:05:01.693688 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmdst\" (UniqueName: \"kubernetes.io/projected/515b6f83-1bdf-4d83-a87a-29a220dce432-kube-api-access-lmdst\") pod \"community-operators-q2m4j\" (UID: \"515b6f83-1bdf-4d83-a87a-29a220dce432\") " pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:01 crc kubenswrapper[4776]: I1204 10:05:01.770366 4776 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:02 crc kubenswrapper[4776]: I1204 10:05:02.325242 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2m4j"] Dec 04 10:05:02 crc kubenswrapper[4776]: I1204 10:05:02.742518 4776 generic.go:334] "Generic (PLEG): container finished" podID="515b6f83-1bdf-4d83-a87a-29a220dce432" containerID="6ebcfb876a570c9f8cd1913e621f49e0cc6822a88382df32bea6d6008e2c6e4b" exitCode=0 Dec 04 10:05:02 crc kubenswrapper[4776]: I1204 10:05:02.742624 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2m4j" event={"ID":"515b6f83-1bdf-4d83-a87a-29a220dce432","Type":"ContainerDied","Data":"6ebcfb876a570c9f8cd1913e621f49e0cc6822a88382df32bea6d6008e2c6e4b"} Dec 04 10:05:02 crc kubenswrapper[4776]: I1204 10:05:02.742872 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2m4j" event={"ID":"515b6f83-1bdf-4d83-a87a-29a220dce432","Type":"ContainerStarted","Data":"ca13c70bc41dc1f66f69b095986368d33b04f53ef6dcd5a1a8bd8546a355fff4"} Dec 04 10:05:03 crc kubenswrapper[4776]: I1204 10:05:03.752410 4776 generic.go:334] "Generic (PLEG): container finished" podID="d3421cf4-b5ee-45c2-a236-02d6ae85f719" containerID="2d1e1295148b75dc91863a4f5ba4654a148e598567b907ca1769f7bf41e7f28c" exitCode=0 Dec 04 10:05:03 crc kubenswrapper[4776]: I1204 10:05:03.752499 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" event={"ID":"d3421cf4-b5ee-45c2-a236-02d6ae85f719","Type":"ContainerDied","Data":"2d1e1295148b75dc91863a4f5ba4654a148e598567b907ca1769f7bf41e7f28c"} Dec 04 10:05:04 crc kubenswrapper[4776]: I1204 10:05:04.764053 4776 generic.go:334] "Generic (PLEG): container finished" podID="515b6f83-1bdf-4d83-a87a-29a220dce432" 
containerID="f89ca669cf459a8cc89b467f9abf82e0e704fe4bf1107bc5b5cdc641cefc7091" exitCode=0 Dec 04 10:05:04 crc kubenswrapper[4776]: I1204 10:05:04.764257 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2m4j" event={"ID":"515b6f83-1bdf-4d83-a87a-29a220dce432","Type":"ContainerDied","Data":"f89ca669cf459a8cc89b467f9abf82e0e704fe4bf1107bc5b5cdc641cefc7091"} Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.208136 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.342152 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-repo-setup-combined-ca-bundle\") pod \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.342447 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-inventory\") pod \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.342523 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-ssh-key\") pod \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.342558 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k444p\" (UniqueName: \"kubernetes.io/projected/d3421cf4-b5ee-45c2-a236-02d6ae85f719-kube-api-access-k444p\") pod 
\"d3421cf4-b5ee-45c2-a236-02d6ae85f719\" (UID: \"d3421cf4-b5ee-45c2-a236-02d6ae85f719\") " Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.349202 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3421cf4-b5ee-45c2-a236-02d6ae85f719-kube-api-access-k444p" (OuterVolumeSpecName: "kube-api-access-k444p") pod "d3421cf4-b5ee-45c2-a236-02d6ae85f719" (UID: "d3421cf4-b5ee-45c2-a236-02d6ae85f719"). InnerVolumeSpecName "kube-api-access-k444p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.350181 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d3421cf4-b5ee-45c2-a236-02d6ae85f719" (UID: "d3421cf4-b5ee-45c2-a236-02d6ae85f719"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.375822 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d3421cf4-b5ee-45c2-a236-02d6ae85f719" (UID: "d3421cf4-b5ee-45c2-a236-02d6ae85f719"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.384731 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-inventory" (OuterVolumeSpecName: "inventory") pod "d3421cf4-b5ee-45c2-a236-02d6ae85f719" (UID: "d3421cf4-b5ee-45c2-a236-02d6ae85f719"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.444835 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.444872 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.444887 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k444p\" (UniqueName: \"kubernetes.io/projected/d3421cf4-b5ee-45c2-a236-02d6ae85f719-kube-api-access-k444p\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.444902 4776 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3421cf4-b5ee-45c2-a236-02d6ae85f719-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.776592 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" event={"ID":"d3421cf4-b5ee-45c2-a236-02d6ae85f719","Type":"ContainerDied","Data":"0ecc49eb9455664593754bb042e101824648cab2971fe76193816f40825d5a6f"} Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.776632 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ecc49eb9455664593754bb042e101824648cab2971fe76193816f40825d5a6f" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.776689 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.781859 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2m4j" event={"ID":"515b6f83-1bdf-4d83-a87a-29a220dce432","Type":"ContainerStarted","Data":"2c214cbcbd34c58f99d9db3e459f7390eb7b906789c7e3e9c72c68c66ed7626c"} Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.831763 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q2m4j" podStartSLOduration=2.38279693 podStartE2EDuration="4.831737074s" podCreationTimestamp="2025-12-04 10:05:01 +0000 UTC" firstStartedPulling="2025-12-04 10:05:02.744399552 +0000 UTC m=+1547.610879929" lastFinishedPulling="2025-12-04 10:05:05.193339696 +0000 UTC m=+1550.059820073" observedRunningTime="2025-12-04 10:05:05.804247982 +0000 UTC m=+1550.670728359" watchObservedRunningTime="2025-12-04 10:05:05.831737074 +0000 UTC m=+1550.698217451" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.844975 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2"] Dec 04 10:05:05 crc kubenswrapper[4776]: E1204 10:05:05.845488 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3421cf4-b5ee-45c2-a236-02d6ae85f719" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.845510 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3421cf4-b5ee-45c2-a236-02d6ae85f719" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.845695 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3421cf4-b5ee-45c2-a236-02d6ae85f719" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.846273 4776 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.852002 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.852463 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.852621 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.854266 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.896890 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2"] Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.957909 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.958075 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:05 crc 
kubenswrapper[4776]: I1204 10:05:05.958112 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfkcl\" (UniqueName: \"kubernetes.io/projected/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-kube-api-access-jfkcl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:05 crc kubenswrapper[4776]: I1204 10:05:05.958198 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:06 crc kubenswrapper[4776]: I1204 10:05:06.060087 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:06 crc kubenswrapper[4776]: I1204 10:05:06.060144 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfkcl\" (UniqueName: \"kubernetes.io/projected/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-kube-api-access-jfkcl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:06 crc kubenswrapper[4776]: I1204 10:05:06.060203 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:06 crc kubenswrapper[4776]: I1204 10:05:06.060296 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:06 crc kubenswrapper[4776]: I1204 10:05:06.064338 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:06 crc kubenswrapper[4776]: I1204 10:05:06.064555 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:06 crc kubenswrapper[4776]: I1204 10:05:06.064575 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:06 crc kubenswrapper[4776]: I1204 10:05:06.078533 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfkcl\" (UniqueName: \"kubernetes.io/projected/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-kube-api-access-jfkcl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:06 crc kubenswrapper[4776]: I1204 10:05:06.167485 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:05:06 crc kubenswrapper[4776]: I1204 10:05:06.684441 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2"] Dec 04 10:05:06 crc kubenswrapper[4776]: I1204 10:05:06.790638 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" event={"ID":"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb","Type":"ContainerStarted","Data":"b75f389dc7d15fbc7adcd1f4cda2e33c8de1c133ed8a7eb228441e09efd71221"} Dec 04 10:05:07 crc kubenswrapper[4776]: I1204 10:05:07.802193 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" event={"ID":"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb","Type":"ContainerStarted","Data":"bef53880ab3e659ae3ee6f5191921c88415b9a76ebec0c7859d274b0dcb7cfbd"} Dec 04 10:05:07 crc kubenswrapper[4776]: I1204 10:05:07.824594 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" podStartSLOduration=2.250972145 podStartE2EDuration="2.824576202s" podCreationTimestamp="2025-12-04 10:05:05 +0000 UTC" firstStartedPulling="2025-12-04 10:05:06.696387594 +0000 UTC m=+1551.562867971" lastFinishedPulling="2025-12-04 10:05:07.269991651 +0000 UTC m=+1552.136472028" observedRunningTime="2025-12-04 10:05:07.821802855 +0000 UTC 
m=+1552.688283232" watchObservedRunningTime="2025-12-04 10:05:07.824576202 +0000 UTC m=+1552.691056579" Dec 04 10:05:11 crc kubenswrapper[4776]: I1204 10:05:11.770525 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:11 crc kubenswrapper[4776]: I1204 10:05:11.771278 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:11 crc kubenswrapper[4776]: I1204 10:05:11.817627 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:11 crc kubenswrapper[4776]: I1204 10:05:11.892534 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:12 crc kubenswrapper[4776]: I1204 10:05:12.053331 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2m4j"] Dec 04 10:05:13 crc kubenswrapper[4776]: I1204 10:05:13.867191 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q2m4j" podUID="515b6f83-1bdf-4d83-a87a-29a220dce432" containerName="registry-server" containerID="cri-o://2c214cbcbd34c58f99d9db3e459f7390eb7b906789c7e3e9c72c68c66ed7626c" gracePeriod=2 Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.345809 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.438024 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmdst\" (UniqueName: \"kubernetes.io/projected/515b6f83-1bdf-4d83-a87a-29a220dce432-kube-api-access-lmdst\") pod \"515b6f83-1bdf-4d83-a87a-29a220dce432\" (UID: \"515b6f83-1bdf-4d83-a87a-29a220dce432\") " Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.438151 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/515b6f83-1bdf-4d83-a87a-29a220dce432-catalog-content\") pod \"515b6f83-1bdf-4d83-a87a-29a220dce432\" (UID: \"515b6f83-1bdf-4d83-a87a-29a220dce432\") " Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.438297 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/515b6f83-1bdf-4d83-a87a-29a220dce432-utilities\") pod \"515b6f83-1bdf-4d83-a87a-29a220dce432\" (UID: \"515b6f83-1bdf-4d83-a87a-29a220dce432\") " Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.439160 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/515b6f83-1bdf-4d83-a87a-29a220dce432-utilities" (OuterVolumeSpecName: "utilities") pod "515b6f83-1bdf-4d83-a87a-29a220dce432" (UID: "515b6f83-1bdf-4d83-a87a-29a220dce432"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.446198 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/515b6f83-1bdf-4d83-a87a-29a220dce432-kube-api-access-lmdst" (OuterVolumeSpecName: "kube-api-access-lmdst") pod "515b6f83-1bdf-4d83-a87a-29a220dce432" (UID: "515b6f83-1bdf-4d83-a87a-29a220dce432"). InnerVolumeSpecName "kube-api-access-lmdst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.505305 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/515b6f83-1bdf-4d83-a87a-29a220dce432-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "515b6f83-1bdf-4d83-a87a-29a220dce432" (UID: "515b6f83-1bdf-4d83-a87a-29a220dce432"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.546782 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/515b6f83-1bdf-4d83-a87a-29a220dce432-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.546833 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmdst\" (UniqueName: \"kubernetes.io/projected/515b6f83-1bdf-4d83-a87a-29a220dce432-kube-api-access-lmdst\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.546851 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/515b6f83-1bdf-4d83-a87a-29a220dce432-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.878932 4776 generic.go:334] "Generic (PLEG): container finished" podID="515b6f83-1bdf-4d83-a87a-29a220dce432" containerID="2c214cbcbd34c58f99d9db3e459f7390eb7b906789c7e3e9c72c68c66ed7626c" exitCode=0 Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.879271 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q2m4j" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.879293 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2m4j" event={"ID":"515b6f83-1bdf-4d83-a87a-29a220dce432","Type":"ContainerDied","Data":"2c214cbcbd34c58f99d9db3e459f7390eb7b906789c7e3e9c72c68c66ed7626c"} Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.879329 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2m4j" event={"ID":"515b6f83-1bdf-4d83-a87a-29a220dce432","Type":"ContainerDied","Data":"ca13c70bc41dc1f66f69b095986368d33b04f53ef6dcd5a1a8bd8546a355fff4"} Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.879351 4776 scope.go:117] "RemoveContainer" containerID="2c214cbcbd34c58f99d9db3e459f7390eb7b906789c7e3e9c72c68c66ed7626c" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.906155 4776 scope.go:117] "RemoveContainer" containerID="f89ca669cf459a8cc89b467f9abf82e0e704fe4bf1107bc5b5cdc641cefc7091" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.921873 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2m4j"] Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.931517 4776 scope.go:117] "RemoveContainer" containerID="6ebcfb876a570c9f8cd1913e621f49e0cc6822a88382df32bea6d6008e2c6e4b" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.933939 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q2m4j"] Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.970834 4776 scope.go:117] "RemoveContainer" containerID="2c214cbcbd34c58f99d9db3e459f7390eb7b906789c7e3e9c72c68c66ed7626c" Dec 04 10:05:14 crc kubenswrapper[4776]: E1204 10:05:14.971409 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2c214cbcbd34c58f99d9db3e459f7390eb7b906789c7e3e9c72c68c66ed7626c\": container with ID starting with 2c214cbcbd34c58f99d9db3e459f7390eb7b906789c7e3e9c72c68c66ed7626c not found: ID does not exist" containerID="2c214cbcbd34c58f99d9db3e459f7390eb7b906789c7e3e9c72c68c66ed7626c" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.971462 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c214cbcbd34c58f99d9db3e459f7390eb7b906789c7e3e9c72c68c66ed7626c"} err="failed to get container status \"2c214cbcbd34c58f99d9db3e459f7390eb7b906789c7e3e9c72c68c66ed7626c\": rpc error: code = NotFound desc = could not find container \"2c214cbcbd34c58f99d9db3e459f7390eb7b906789c7e3e9c72c68c66ed7626c\": container with ID starting with 2c214cbcbd34c58f99d9db3e459f7390eb7b906789c7e3e9c72c68c66ed7626c not found: ID does not exist" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.971492 4776 scope.go:117] "RemoveContainer" containerID="f89ca669cf459a8cc89b467f9abf82e0e704fe4bf1107bc5b5cdc641cefc7091" Dec 04 10:05:14 crc kubenswrapper[4776]: E1204 10:05:14.971972 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f89ca669cf459a8cc89b467f9abf82e0e704fe4bf1107bc5b5cdc641cefc7091\": container with ID starting with f89ca669cf459a8cc89b467f9abf82e0e704fe4bf1107bc5b5cdc641cefc7091 not found: ID does not exist" containerID="f89ca669cf459a8cc89b467f9abf82e0e704fe4bf1107bc5b5cdc641cefc7091" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.972023 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f89ca669cf459a8cc89b467f9abf82e0e704fe4bf1107bc5b5cdc641cefc7091"} err="failed to get container status \"f89ca669cf459a8cc89b467f9abf82e0e704fe4bf1107bc5b5cdc641cefc7091\": rpc error: code = NotFound desc = could not find container \"f89ca669cf459a8cc89b467f9abf82e0e704fe4bf1107bc5b5cdc641cefc7091\": container with ID 
starting with f89ca669cf459a8cc89b467f9abf82e0e704fe4bf1107bc5b5cdc641cefc7091 not found: ID does not exist" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.972046 4776 scope.go:117] "RemoveContainer" containerID="6ebcfb876a570c9f8cd1913e621f49e0cc6822a88382df32bea6d6008e2c6e4b" Dec 04 10:05:14 crc kubenswrapper[4776]: E1204 10:05:14.972397 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ebcfb876a570c9f8cd1913e621f49e0cc6822a88382df32bea6d6008e2c6e4b\": container with ID starting with 6ebcfb876a570c9f8cd1913e621f49e0cc6822a88382df32bea6d6008e2c6e4b not found: ID does not exist" containerID="6ebcfb876a570c9f8cd1913e621f49e0cc6822a88382df32bea6d6008e2c6e4b" Dec 04 10:05:14 crc kubenswrapper[4776]: I1204 10:05:14.972447 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ebcfb876a570c9f8cd1913e621f49e0cc6822a88382df32bea6d6008e2c6e4b"} err="failed to get container status \"6ebcfb876a570c9f8cd1913e621f49e0cc6822a88382df32bea6d6008e2c6e4b\": rpc error: code = NotFound desc = could not find container \"6ebcfb876a570c9f8cd1913e621f49e0cc6822a88382df32bea6d6008e2c6e4b\": container with ID starting with 6ebcfb876a570c9f8cd1913e621f49e0cc6822a88382df32bea6d6008e2c6e4b not found: ID does not exist" Dec 04 10:05:15 crc kubenswrapper[4776]: I1204 10:05:15.467753 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="515b6f83-1bdf-4d83-a87a-29a220dce432" path="/var/lib/kubelet/pods/515b6f83-1bdf-4d83-a87a-29a220dce432/volumes" Dec 04 10:05:19 crc kubenswrapper[4776]: I1204 10:05:19.379773 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:05:19 crc kubenswrapper[4776]: I1204 
10:05:19.381158 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:05:19 crc kubenswrapper[4776]: I1204 10:05:19.381256 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 10:05:19 crc kubenswrapper[4776]: I1204 10:05:19.382515 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:05:19 crc kubenswrapper[4776]: I1204 10:05:19.382602 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" gracePeriod=600 Dec 04 10:05:19 crc kubenswrapper[4776]: E1204 10:05:19.517624 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:05:19 crc kubenswrapper[4776]: I1204 10:05:19.935368 4776 generic.go:334] "Generic (PLEG): container finished" 
podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" exitCode=0 Dec 04 10:05:19 crc kubenswrapper[4776]: I1204 10:05:19.935436 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96"} Dec 04 10:05:19 crc kubenswrapper[4776]: I1204 10:05:19.935701 4776 scope.go:117] "RemoveContainer" containerID="5787add21e877423617566cc01fd1cd5d93ab12b7726098df3a77184a49fa270" Dec 04 10:05:19 crc kubenswrapper[4776]: I1204 10:05:19.936418 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:05:19 crc kubenswrapper[4776]: E1204 10:05:19.936750 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:05:32 crc kubenswrapper[4776]: I1204 10:05:32.452566 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:05:32 crc kubenswrapper[4776]: E1204 10:05:32.453331 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 
10:05:36 crc kubenswrapper[4776]: I1204 10:05:36.811553 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-krkwj"] Dec 04 10:05:36 crc kubenswrapper[4776]: E1204 10:05:36.812779 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515b6f83-1bdf-4d83-a87a-29a220dce432" containerName="extract-content" Dec 04 10:05:36 crc kubenswrapper[4776]: I1204 10:05:36.812797 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="515b6f83-1bdf-4d83-a87a-29a220dce432" containerName="extract-content" Dec 04 10:05:36 crc kubenswrapper[4776]: E1204 10:05:36.812827 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515b6f83-1bdf-4d83-a87a-29a220dce432" containerName="extract-utilities" Dec 04 10:05:36 crc kubenswrapper[4776]: I1204 10:05:36.812838 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="515b6f83-1bdf-4d83-a87a-29a220dce432" containerName="extract-utilities" Dec 04 10:05:36 crc kubenswrapper[4776]: E1204 10:05:36.812847 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515b6f83-1bdf-4d83-a87a-29a220dce432" containerName="registry-server" Dec 04 10:05:36 crc kubenswrapper[4776]: I1204 10:05:36.812856 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="515b6f83-1bdf-4d83-a87a-29a220dce432" containerName="registry-server" Dec 04 10:05:36 crc kubenswrapper[4776]: I1204 10:05:36.813096 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="515b6f83-1bdf-4d83-a87a-29a220dce432" containerName="registry-server" Dec 04 10:05:36 crc kubenswrapper[4776]: I1204 10:05:36.816840 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:36 crc kubenswrapper[4776]: I1204 10:05:36.828331 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krkwj"] Dec 04 10:05:36 crc kubenswrapper[4776]: I1204 10:05:36.997384 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-catalog-content\") pod \"redhat-marketplace-krkwj\" (UID: \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\") " pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:36 crc kubenswrapper[4776]: I1204 10:05:36.998152 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-utilities\") pod \"redhat-marketplace-krkwj\" (UID: \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\") " pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:36 crc kubenswrapper[4776]: I1204 10:05:36.998308 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vppr2\" (UniqueName: \"kubernetes.io/projected/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-kube-api-access-vppr2\") pod \"redhat-marketplace-krkwj\" (UID: \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\") " pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:37 crc kubenswrapper[4776]: I1204 10:05:37.099890 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-utilities\") pod \"redhat-marketplace-krkwj\" (UID: \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\") " pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:37 crc kubenswrapper[4776]: I1204 10:05:37.099994 4776 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-vppr2\" (UniqueName: \"kubernetes.io/projected/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-kube-api-access-vppr2\") pod \"redhat-marketplace-krkwj\" (UID: \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\") " pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:37 crc kubenswrapper[4776]: I1204 10:05:37.100106 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-catalog-content\") pod \"redhat-marketplace-krkwj\" (UID: \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\") " pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:37 crc kubenswrapper[4776]: I1204 10:05:37.100672 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-utilities\") pod \"redhat-marketplace-krkwj\" (UID: \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\") " pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:37 crc kubenswrapper[4776]: I1204 10:05:37.100751 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-catalog-content\") pod \"redhat-marketplace-krkwj\" (UID: \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\") " pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:37 crc kubenswrapper[4776]: I1204 10:05:37.134121 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vppr2\" (UniqueName: \"kubernetes.io/projected/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-kube-api-access-vppr2\") pod \"redhat-marketplace-krkwj\" (UID: \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\") " pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:37 crc kubenswrapper[4776]: I1204 10:05:37.148048 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:37 crc kubenswrapper[4776]: I1204 10:05:37.625127 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krkwj"] Dec 04 10:05:38 crc kubenswrapper[4776]: I1204 10:05:38.093829 4776 generic.go:334] "Generic (PLEG): container finished" podID="d276c5dc-b2dc-48f2-b8a5-2edca69300ce" containerID="f6b3e51bf89b5aaaf602739648c8aef816d52bbde8d465c70fbdaa6b38606082" exitCode=0 Dec 04 10:05:38 crc kubenswrapper[4776]: I1204 10:05:38.093940 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krkwj" event={"ID":"d276c5dc-b2dc-48f2-b8a5-2edca69300ce","Type":"ContainerDied","Data":"f6b3e51bf89b5aaaf602739648c8aef816d52bbde8d465c70fbdaa6b38606082"} Dec 04 10:05:38 crc kubenswrapper[4776]: I1204 10:05:38.094176 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krkwj" event={"ID":"d276c5dc-b2dc-48f2-b8a5-2edca69300ce","Type":"ContainerStarted","Data":"6817f28bd0faf9c2853493587fab4ffa57b1daa28ca5302c4427829969d416f8"} Dec 04 10:05:39 crc kubenswrapper[4776]: I1204 10:05:39.106596 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krkwj" event={"ID":"d276c5dc-b2dc-48f2-b8a5-2edca69300ce","Type":"ContainerStarted","Data":"0dedbb08880f2794a34a8e7be9907ffb90259eb1053afc35ce1fc698bbbbe0d1"} Dec 04 10:05:40 crc kubenswrapper[4776]: I1204 10:05:40.119281 4776 generic.go:334] "Generic (PLEG): container finished" podID="d276c5dc-b2dc-48f2-b8a5-2edca69300ce" containerID="0dedbb08880f2794a34a8e7be9907ffb90259eb1053afc35ce1fc698bbbbe0d1" exitCode=0 Dec 04 10:05:40 crc kubenswrapper[4776]: I1204 10:05:40.119421 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krkwj" 
event={"ID":"d276c5dc-b2dc-48f2-b8a5-2edca69300ce","Type":"ContainerDied","Data":"0dedbb08880f2794a34a8e7be9907ffb90259eb1053afc35ce1fc698bbbbe0d1"} Dec 04 10:05:41 crc kubenswrapper[4776]: I1204 10:05:41.131896 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krkwj" event={"ID":"d276c5dc-b2dc-48f2-b8a5-2edca69300ce","Type":"ContainerStarted","Data":"4224fa392fc802d48135c9078b0a42266402e2fbcf23875ece77657ec10cbbbc"} Dec 04 10:05:41 crc kubenswrapper[4776]: I1204 10:05:41.168846 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-krkwj" podStartSLOduration=2.652092266 podStartE2EDuration="5.168814934s" podCreationTimestamp="2025-12-04 10:05:36 +0000 UTC" firstStartedPulling="2025-12-04 10:05:38.096185274 +0000 UTC m=+1582.962665671" lastFinishedPulling="2025-12-04 10:05:40.612907962 +0000 UTC m=+1585.479388339" observedRunningTime="2025-12-04 10:05:41.16197033 +0000 UTC m=+1586.028450707" watchObservedRunningTime="2025-12-04 10:05:41.168814934 +0000 UTC m=+1586.035295311" Dec 04 10:05:46 crc kubenswrapper[4776]: I1204 10:05:46.452342 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:05:46 crc kubenswrapper[4776]: E1204 10:05:46.453164 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:05:47 crc kubenswrapper[4776]: I1204 10:05:47.148641 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:47 crc 
kubenswrapper[4776]: I1204 10:05:47.148719 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:47 crc kubenswrapper[4776]: I1204 10:05:47.200813 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:47 crc kubenswrapper[4776]: I1204 10:05:47.252696 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:47 crc kubenswrapper[4776]: I1204 10:05:47.439382 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krkwj"] Dec 04 10:05:49 crc kubenswrapper[4776]: I1204 10:05:49.203435 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-krkwj" podUID="d276c5dc-b2dc-48f2-b8a5-2edca69300ce" containerName="registry-server" containerID="cri-o://4224fa392fc802d48135c9078b0a42266402e2fbcf23875ece77657ec10cbbbc" gracePeriod=2 Dec 04 10:05:49 crc kubenswrapper[4776]: I1204 10:05:49.674492 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:49 crc kubenswrapper[4776]: I1204 10:05:49.782938 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-catalog-content\") pod \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\" (UID: \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\") " Dec 04 10:05:49 crc kubenswrapper[4776]: I1204 10:05:49.783133 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vppr2\" (UniqueName: \"kubernetes.io/projected/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-kube-api-access-vppr2\") pod \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\" (UID: \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\") " Dec 04 10:05:49 crc kubenswrapper[4776]: I1204 10:05:49.783172 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-utilities\") pod \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\" (UID: \"d276c5dc-b2dc-48f2-b8a5-2edca69300ce\") " Dec 04 10:05:49 crc kubenswrapper[4776]: I1204 10:05:49.784139 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-utilities" (OuterVolumeSpecName: "utilities") pod "d276c5dc-b2dc-48f2-b8a5-2edca69300ce" (UID: "d276c5dc-b2dc-48f2-b8a5-2edca69300ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:05:49 crc kubenswrapper[4776]: I1204 10:05:49.788813 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-kube-api-access-vppr2" (OuterVolumeSpecName: "kube-api-access-vppr2") pod "d276c5dc-b2dc-48f2-b8a5-2edca69300ce" (UID: "d276c5dc-b2dc-48f2-b8a5-2edca69300ce"). InnerVolumeSpecName "kube-api-access-vppr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:05:49 crc kubenswrapper[4776]: I1204 10:05:49.805260 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d276c5dc-b2dc-48f2-b8a5-2edca69300ce" (UID: "d276c5dc-b2dc-48f2-b8a5-2edca69300ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:05:49 crc kubenswrapper[4776]: I1204 10:05:49.885517 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:49 crc kubenswrapper[4776]: I1204 10:05:49.885554 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vppr2\" (UniqueName: \"kubernetes.io/projected/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-kube-api-access-vppr2\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:49 crc kubenswrapper[4776]: I1204 10:05:49.885567 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d276c5dc-b2dc-48f2-b8a5-2edca69300ce-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.215591 4776 generic.go:334] "Generic (PLEG): container finished" podID="d276c5dc-b2dc-48f2-b8a5-2edca69300ce" containerID="4224fa392fc802d48135c9078b0a42266402e2fbcf23875ece77657ec10cbbbc" exitCode=0 Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.215645 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krkwj" event={"ID":"d276c5dc-b2dc-48f2-b8a5-2edca69300ce","Type":"ContainerDied","Data":"4224fa392fc802d48135c9078b0a42266402e2fbcf23875ece77657ec10cbbbc"} Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.215683 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-krkwj" event={"ID":"d276c5dc-b2dc-48f2-b8a5-2edca69300ce","Type":"ContainerDied","Data":"6817f28bd0faf9c2853493587fab4ffa57b1daa28ca5302c4427829969d416f8"} Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.215705 4776 scope.go:117] "RemoveContainer" containerID="4224fa392fc802d48135c9078b0a42266402e2fbcf23875ece77657ec10cbbbc" Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.215711 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krkwj" Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.251281 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krkwj"] Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.252371 4776 scope.go:117] "RemoveContainer" containerID="0dedbb08880f2794a34a8e7be9907ffb90259eb1053afc35ce1fc698bbbbe0d1" Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.261046 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-krkwj"] Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.273634 4776 scope.go:117] "RemoveContainer" containerID="f6b3e51bf89b5aaaf602739648c8aef816d52bbde8d465c70fbdaa6b38606082" Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.310945 4776 scope.go:117] "RemoveContainer" containerID="4224fa392fc802d48135c9078b0a42266402e2fbcf23875ece77657ec10cbbbc" Dec 04 10:05:50 crc kubenswrapper[4776]: E1204 10:05:50.311692 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4224fa392fc802d48135c9078b0a42266402e2fbcf23875ece77657ec10cbbbc\": container with ID starting with 4224fa392fc802d48135c9078b0a42266402e2fbcf23875ece77657ec10cbbbc not found: ID does not exist" containerID="4224fa392fc802d48135c9078b0a42266402e2fbcf23875ece77657ec10cbbbc" Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.311731 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4224fa392fc802d48135c9078b0a42266402e2fbcf23875ece77657ec10cbbbc"} err="failed to get container status \"4224fa392fc802d48135c9078b0a42266402e2fbcf23875ece77657ec10cbbbc\": rpc error: code = NotFound desc = could not find container \"4224fa392fc802d48135c9078b0a42266402e2fbcf23875ece77657ec10cbbbc\": container with ID starting with 4224fa392fc802d48135c9078b0a42266402e2fbcf23875ece77657ec10cbbbc not found: ID does not exist" Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.311752 4776 scope.go:117] "RemoveContainer" containerID="0dedbb08880f2794a34a8e7be9907ffb90259eb1053afc35ce1fc698bbbbe0d1" Dec 04 10:05:50 crc kubenswrapper[4776]: E1204 10:05:50.312041 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dedbb08880f2794a34a8e7be9907ffb90259eb1053afc35ce1fc698bbbbe0d1\": container with ID starting with 0dedbb08880f2794a34a8e7be9907ffb90259eb1053afc35ce1fc698bbbbe0d1 not found: ID does not exist" containerID="0dedbb08880f2794a34a8e7be9907ffb90259eb1053afc35ce1fc698bbbbe0d1" Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.312073 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dedbb08880f2794a34a8e7be9907ffb90259eb1053afc35ce1fc698bbbbe0d1"} err="failed to get container status \"0dedbb08880f2794a34a8e7be9907ffb90259eb1053afc35ce1fc698bbbbe0d1\": rpc error: code = NotFound desc = could not find container \"0dedbb08880f2794a34a8e7be9907ffb90259eb1053afc35ce1fc698bbbbe0d1\": container with ID starting with 0dedbb08880f2794a34a8e7be9907ffb90259eb1053afc35ce1fc698bbbbe0d1 not found: ID does not exist" Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.312091 4776 scope.go:117] "RemoveContainer" containerID="f6b3e51bf89b5aaaf602739648c8aef816d52bbde8d465c70fbdaa6b38606082" Dec 04 10:05:50 crc kubenswrapper[4776]: E1204 
10:05:50.312496 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b3e51bf89b5aaaf602739648c8aef816d52bbde8d465c70fbdaa6b38606082\": container with ID starting with f6b3e51bf89b5aaaf602739648c8aef816d52bbde8d465c70fbdaa6b38606082 not found: ID does not exist" containerID="f6b3e51bf89b5aaaf602739648c8aef816d52bbde8d465c70fbdaa6b38606082" Dec 04 10:05:50 crc kubenswrapper[4776]: I1204 10:05:50.312521 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b3e51bf89b5aaaf602739648c8aef816d52bbde8d465c70fbdaa6b38606082"} err="failed to get container status \"f6b3e51bf89b5aaaf602739648c8aef816d52bbde8d465c70fbdaa6b38606082\": rpc error: code = NotFound desc = could not find container \"f6b3e51bf89b5aaaf602739648c8aef816d52bbde8d465c70fbdaa6b38606082\": container with ID starting with f6b3e51bf89b5aaaf602739648c8aef816d52bbde8d465c70fbdaa6b38606082 not found: ID does not exist" Dec 04 10:05:51 crc kubenswrapper[4776]: I1204 10:05:51.463750 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d276c5dc-b2dc-48f2-b8a5-2edca69300ce" path="/var/lib/kubelet/pods/d276c5dc-b2dc-48f2-b8a5-2edca69300ce/volumes" Dec 04 10:05:57 crc kubenswrapper[4776]: I1204 10:05:57.453171 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:05:57 crc kubenswrapper[4776]: E1204 10:05:57.454199 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:06:10 crc kubenswrapper[4776]: I1204 10:06:10.452977 
4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:06:10 crc kubenswrapper[4776]: E1204 10:06:10.453865 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:06:19 crc kubenswrapper[4776]: I1204 10:06:19.806484 4776 scope.go:117] "RemoveContainer" containerID="573e315a75a0b96a36b02683cea5517fd4355e5598a9d565f98a1953a929280b" Dec 04 10:06:19 crc kubenswrapper[4776]: I1204 10:06:19.851366 4776 scope.go:117] "RemoveContainer" containerID="5a91c13bf2f6fd9c013eefb602b3f9ab04b60c36f0b1dd4440db61dff31235b8" Dec 04 10:06:19 crc kubenswrapper[4776]: I1204 10:06:19.883624 4776 scope.go:117] "RemoveContainer" containerID="8cb3c87de8875ddc8a3f922151f62c3a7acb6802f63308985a0ffda8f5f02f0e" Dec 04 10:06:22 crc kubenswrapper[4776]: I1204 10:06:22.452453 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:06:22 crc kubenswrapper[4776]: E1204 10:06:22.452997 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:06:36 crc kubenswrapper[4776]: I1204 10:06:36.452724 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" 
Dec 04 10:06:36 crc kubenswrapper[4776]: E1204 10:06:36.453599 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:06:51 crc kubenswrapper[4776]: I1204 10:06:51.453299 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:06:51 crc kubenswrapper[4776]: E1204 10:06:51.454293 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:07:05 crc kubenswrapper[4776]: I1204 10:07:05.462208 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:07:05 crc kubenswrapper[4776]: E1204 10:07:05.463066 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:07:18 crc kubenswrapper[4776]: I1204 10:07:18.452284 4776 scope.go:117] "RemoveContainer" 
containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:07:18 crc kubenswrapper[4776]: E1204 10:07:18.452983 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:07:31 crc kubenswrapper[4776]: I1204 10:07:31.452908 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:07:31 crc kubenswrapper[4776]: E1204 10:07:31.453748 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:07:42 crc kubenswrapper[4776]: I1204 10:07:42.452172 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:07:42 crc kubenswrapper[4776]: E1204 10:07:42.452881 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:07:55 crc kubenswrapper[4776]: I1204 10:07:55.462640 4776 scope.go:117] 
"RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:07:55 crc kubenswrapper[4776]: E1204 10:07:55.463493 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:08:06 crc kubenswrapper[4776]: I1204 10:08:06.452380 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:08:06 crc kubenswrapper[4776]: E1204 10:08:06.453094 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:08:20 crc kubenswrapper[4776]: I1204 10:08:20.452902 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:08:20 crc kubenswrapper[4776]: E1204 10:08:20.453815 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:08:31 crc kubenswrapper[4776]: I1204 10:08:31.452218 
4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:08:31 crc kubenswrapper[4776]: E1204 10:08:31.453112 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:08:43 crc kubenswrapper[4776]: I1204 10:08:43.453147 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:08:43 crc kubenswrapper[4776]: E1204 10:08:43.453893 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:08:46 crc kubenswrapper[4776]: E1204 10:08:46.357317 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d88dbcd_c42d_49c5_b71d_6b90735ba4fb.slice/crio-bef53880ab3e659ae3ee6f5191921c88415b9a76ebec0c7859d274b0dcb7cfbd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d88dbcd_c42d_49c5_b71d_6b90735ba4fb.slice/crio-conmon-bef53880ab3e659ae3ee6f5191921c88415b9a76ebec0c7859d274b0dcb7cfbd.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:08:46 crc kubenswrapper[4776]: I1204 
10:08:46.915519 4776 generic.go:334] "Generic (PLEG): container finished" podID="4d88dbcd-c42d-49c5-b71d-6b90735ba4fb" containerID="bef53880ab3e659ae3ee6f5191921c88415b9a76ebec0c7859d274b0dcb7cfbd" exitCode=0 Dec 04 10:08:46 crc kubenswrapper[4776]: I1204 10:08:46.915588 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" event={"ID":"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb","Type":"ContainerDied","Data":"bef53880ab3e659ae3ee6f5191921c88415b9a76ebec0c7859d274b0dcb7cfbd"} Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.321754 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.404216 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-bootstrap-combined-ca-bundle\") pod \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.410308 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4d88dbcd-c42d-49c5-b71d-6b90735ba4fb" (UID: "4d88dbcd-c42d-49c5-b71d-6b90735ba4fb"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.505985 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-ssh-key\") pod \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.506361 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfkcl\" (UniqueName: \"kubernetes.io/projected/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-kube-api-access-jfkcl\") pod \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.506633 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-inventory\") pod \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\" (UID: \"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb\") " Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.507623 4776 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.509810 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-kube-api-access-jfkcl" (OuterVolumeSpecName: "kube-api-access-jfkcl") pod "4d88dbcd-c42d-49c5-b71d-6b90735ba4fb" (UID: "4d88dbcd-c42d-49c5-b71d-6b90735ba4fb"). InnerVolumeSpecName "kube-api-access-jfkcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.534352 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4d88dbcd-c42d-49c5-b71d-6b90735ba4fb" (UID: "4d88dbcd-c42d-49c5-b71d-6b90735ba4fb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.541009 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-inventory" (OuterVolumeSpecName: "inventory") pod "4d88dbcd-c42d-49c5-b71d-6b90735ba4fb" (UID: "4d88dbcd-c42d-49c5-b71d-6b90735ba4fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.608951 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.608981 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfkcl\" (UniqueName: \"kubernetes.io/projected/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-kube-api-access-jfkcl\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.608991 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.934790 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" 
event={"ID":"4d88dbcd-c42d-49c5-b71d-6b90735ba4fb","Type":"ContainerDied","Data":"b75f389dc7d15fbc7adcd1f4cda2e33c8de1c133ed8a7eb228441e09efd71221"} Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.934833 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b75f389dc7d15fbc7adcd1f4cda2e33c8de1c133ed8a7eb228441e09efd71221" Dec 04 10:08:48 crc kubenswrapper[4776]: I1204 10:08:48.934868 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.028378 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn"] Dec 04 10:08:49 crc kubenswrapper[4776]: E1204 10:08:49.028815 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d88dbcd-c42d-49c5-b71d-6b90735ba4fb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.028836 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d88dbcd-c42d-49c5-b71d-6b90735ba4fb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 10:08:49 crc kubenswrapper[4776]: E1204 10:08:49.028868 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d276c5dc-b2dc-48f2-b8a5-2edca69300ce" containerName="extract-utilities" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.028877 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d276c5dc-b2dc-48f2-b8a5-2edca69300ce" containerName="extract-utilities" Dec 04 10:08:49 crc kubenswrapper[4776]: E1204 10:08:49.028886 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d276c5dc-b2dc-48f2-b8a5-2edca69300ce" containerName="registry-server" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.028894 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d276c5dc-b2dc-48f2-b8a5-2edca69300ce" 
containerName="registry-server" Dec 04 10:08:49 crc kubenswrapper[4776]: E1204 10:08:49.028933 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d276c5dc-b2dc-48f2-b8a5-2edca69300ce" containerName="extract-content" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.028943 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d276c5dc-b2dc-48f2-b8a5-2edca69300ce" containerName="extract-content" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.029139 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d276c5dc-b2dc-48f2-b8a5-2edca69300ce" containerName="registry-server" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.029163 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d88dbcd-c42d-49c5-b71d-6b90735ba4fb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.029938 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.032354 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.032470 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.032588 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.048884 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.059943 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn"] Dec 
04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.282574 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gn5j\" (UniqueName: \"kubernetes.io/projected/8d0203c8-396c-4504-904c-18a82d237314-kube-api-access-6gn5j\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knqvn\" (UID: \"8d0203c8-396c-4504-904c-18a82d237314\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.283042 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d0203c8-396c-4504-904c-18a82d237314-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knqvn\" (UID: \"8d0203c8-396c-4504-904c-18a82d237314\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.283078 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d0203c8-396c-4504-904c-18a82d237314-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knqvn\" (UID: \"8d0203c8-396c-4504-904c-18a82d237314\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.384772 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gn5j\" (UniqueName: \"kubernetes.io/projected/8d0203c8-396c-4504-904c-18a82d237314-kube-api-access-6gn5j\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knqvn\" (UID: \"8d0203c8-396c-4504-904c-18a82d237314\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.384941 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/8d0203c8-396c-4504-904c-18a82d237314-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knqvn\" (UID: \"8d0203c8-396c-4504-904c-18a82d237314\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.384965 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d0203c8-396c-4504-904c-18a82d237314-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knqvn\" (UID: \"8d0203c8-396c-4504-904c-18a82d237314\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.396790 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d0203c8-396c-4504-904c-18a82d237314-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knqvn\" (UID: \"8d0203c8-396c-4504-904c-18a82d237314\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.397014 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d0203c8-396c-4504-904c-18a82d237314-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knqvn\" (UID: \"8d0203c8-396c-4504-904c-18a82d237314\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.403840 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gn5j\" (UniqueName: \"kubernetes.io/projected/8d0203c8-396c-4504-904c-18a82d237314-kube-api-access-6gn5j\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-knqvn\" (UID: \"8d0203c8-396c-4504-904c-18a82d237314\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" Dec 04 10:08:49 crc kubenswrapper[4776]: I1204 10:08:49.645363 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" Dec 04 10:08:50 crc kubenswrapper[4776]: I1204 10:08:50.204355 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn"] Dec 04 10:08:50 crc kubenswrapper[4776]: I1204 10:08:50.208862 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:08:50 crc kubenswrapper[4776]: I1204 10:08:50.951808 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" event={"ID":"8d0203c8-396c-4504-904c-18a82d237314","Type":"ContainerStarted","Data":"f73b9777636353f09e541a566d467454158aa7ee2bc3a628d9384a5742545ba5"} Dec 04 10:08:50 crc kubenswrapper[4776]: I1204 10:08:50.952129 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" event={"ID":"8d0203c8-396c-4504-904c-18a82d237314","Type":"ContainerStarted","Data":"6eafc35716997a797a47cdd1ed03fecd0b343a9aef25951100d7a240459da44c"} Dec 04 10:08:50 crc kubenswrapper[4776]: I1204 10:08:50.969171 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" podStartSLOduration=1.5094397659999998 podStartE2EDuration="1.969153185s" podCreationTimestamp="2025-12-04 10:08:49 +0000 UTC" firstStartedPulling="2025-12-04 10:08:50.208615996 +0000 UTC m=+1775.075096373" lastFinishedPulling="2025-12-04 10:08:50.668329415 +0000 UTC m=+1775.534809792" observedRunningTime="2025-12-04 10:08:50.967050479 +0000 UTC m=+1775.833530926" watchObservedRunningTime="2025-12-04 10:08:50.969153185 +0000 UTC 
m=+1775.835633562" Dec 04 10:08:54 crc kubenswrapper[4776]: I1204 10:08:54.452625 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:08:54 crc kubenswrapper[4776]: E1204 10:08:54.453374 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:09:09 crc kubenswrapper[4776]: I1204 10:09:09.453386 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:09:09 crc kubenswrapper[4776]: E1204 10:09:09.454104 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:09:17 crc kubenswrapper[4776]: I1204 10:09:17.070837 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-z6jmg"] Dec 04 10:09:17 crc kubenswrapper[4776]: I1204 10:09:17.101517 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-e236-account-create-update-n8sb9"] Dec 04 10:09:17 crc kubenswrapper[4776]: I1204 10:09:17.113052 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-e236-account-create-update-n8sb9"] Dec 04 10:09:17 crc kubenswrapper[4776]: I1204 10:09:17.129843 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-create-z6jmg"] Dec 04 10:09:17 crc kubenswrapper[4776]: I1204 10:09:17.463157 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a" path="/var/lib/kubelet/pods/9ae68b78-c26b-4177-a1bf-7b5cef7a4f4a/volumes" Dec 04 10:09:17 crc kubenswrapper[4776]: I1204 10:09:17.464126 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0796537-9ea0-42b5-9701-04487a4ca241" path="/var/lib/kubelet/pods/f0796537-9ea0-42b5-9701-04487a4ca241/volumes" Dec 04 10:09:18 crc kubenswrapper[4776]: I1204 10:09:18.037728 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e717-account-create-update-qc9bj"] Dec 04 10:09:18 crc kubenswrapper[4776]: I1204 10:09:18.046350 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-jdlgz"] Dec 04 10:09:18 crc kubenswrapper[4776]: I1204 10:09:18.059756 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e717-account-create-update-qc9bj"] Dec 04 10:09:18 crc kubenswrapper[4776]: I1204 10:09:18.067668 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-gh7dv"] Dec 04 10:09:18 crc kubenswrapper[4776]: I1204 10:09:18.075844 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-e29f-account-create-update-qpv5v"] Dec 04 10:09:18 crc kubenswrapper[4776]: I1204 10:09:18.083421 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-jdlgz"] Dec 04 10:09:18 crc kubenswrapper[4776]: I1204 10:09:18.091830 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-gh7dv"] Dec 04 10:09:18 crc kubenswrapper[4776]: I1204 10:09:18.100837 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e29f-account-create-update-qpv5v"] Dec 04 10:09:19 crc kubenswrapper[4776]: I1204 10:09:19.464015 4776 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="468626e9-c715-4f2d-bb1e-35f3ac706a17" path="/var/lib/kubelet/pods/468626e9-c715-4f2d-bb1e-35f3ac706a17/volumes" Dec 04 10:09:19 crc kubenswrapper[4776]: I1204 10:09:19.464779 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad3ed01-a668-4335-9254-46a2c1704e90" path="/var/lib/kubelet/pods/bad3ed01-a668-4335-9254-46a2c1704e90/volumes" Dec 04 10:09:19 crc kubenswrapper[4776]: I1204 10:09:19.465614 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c203a97c-dc5b-4a58-bb5c-f826221c87f3" path="/var/lib/kubelet/pods/c203a97c-dc5b-4a58-bb5c-f826221c87f3/volumes" Dec 04 10:09:19 crc kubenswrapper[4776]: I1204 10:09:19.466291 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1551b7-3a64-4400-b3b6-8b3e1334401e" path="/var/lib/kubelet/pods/fb1551b7-3a64-4400-b3b6-8b3e1334401e/volumes" Dec 04 10:09:20 crc kubenswrapper[4776]: I1204 10:09:20.020735 4776 scope.go:117] "RemoveContainer" containerID="aead5ee0ef4ce7755ac198ac9300d19e56c02ec824ec6ecf9ed70caebc744169" Dec 04 10:09:20 crc kubenswrapper[4776]: I1204 10:09:20.045165 4776 scope.go:117] "RemoveContainer" containerID="7e49031e0c831ace68a19564a059c357bd6627a28dfe316983b9b924af419fb4" Dec 04 10:09:20 crc kubenswrapper[4776]: I1204 10:09:20.089409 4776 scope.go:117] "RemoveContainer" containerID="a63949576df1f8f195a4298a3b52d20e47d84f4feddd41f7960c58ec54a73564" Dec 04 10:09:20 crc kubenswrapper[4776]: I1204 10:09:20.130450 4776 scope.go:117] "RemoveContainer" containerID="d2bac99ed10b7b300952dfcd9a61aa4b777eb14de2a46a92993167bcfdd7d941" Dec 04 10:09:20 crc kubenswrapper[4776]: I1204 10:09:20.180096 4776 scope.go:117] "RemoveContainer" containerID="503a98ebf65c564782a93d3715e1c72685e279fcdc9276605a89ae86165428e1" Dec 04 10:09:20 crc kubenswrapper[4776]: I1204 10:09:20.229606 4776 scope.go:117] "RemoveContainer" containerID="360b51b5c7bc3bcd47ecd894ebf652c2aed70e580fffa8bbc24264fd49b7bbe4" Dec 04 10:09:21 crc 
kubenswrapper[4776]: I1204 10:09:21.452337 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:09:21 crc kubenswrapper[4776]: E1204 10:09:21.452704 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:09:34 crc kubenswrapper[4776]: I1204 10:09:34.452200 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:09:34 crc kubenswrapper[4776]: E1204 10:09:34.452956 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:09:39 crc kubenswrapper[4776]: I1204 10:09:39.051847 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qzm7k"] Dec 04 10:09:39 crc kubenswrapper[4776]: I1204 10:09:39.063022 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qzm7k"] Dec 04 10:09:39 crc kubenswrapper[4776]: I1204 10:09:39.466379 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7507bda5-9608-4cd8-b40b-f9e69a06d41c" path="/var/lib/kubelet/pods/7507bda5-9608-4cd8-b40b-f9e69a06d41c/volumes" Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.038634 4776 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-db-create-trxt7"] Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.053033 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-trxt7"] Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.065861 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-0a78-account-create-update-77b4l"] Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.075694 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-0a78-account-create-update-77b4l"] Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.085080 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c65e-account-create-update-8tkss"] Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.094128 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2lscm"] Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.102764 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-245b-account-create-update-pckpb"] Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.110753 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c65e-account-create-update-8tkss"] Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.119610 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-245b-account-create-update-pckpb"] Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.130363 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2lscm"] Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.465705 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c7dcd00-07c0-4e29-b608-e43d0e09cfba" path="/var/lib/kubelet/pods/6c7dcd00-07c0-4e29-b608-e43d0e09cfba/volumes" Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.466808 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="71086560-4754-4032-b819-35a007beb5fd" path="/var/lib/kubelet/pods/71086560-4754-4032-b819-35a007beb5fd/volumes" Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.467475 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="718a3f24-cae1-449e-b979-0058c19dbe4b" path="/var/lib/kubelet/pods/718a3f24-cae1-449e-b979-0058c19dbe4b/volumes" Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.469406 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f4935bc-a17c-4ded-b454-21eb494550e5" path="/var/lib/kubelet/pods/7f4935bc-a17c-4ded-b454-21eb494550e5/volumes" Dec 04 10:09:43 crc kubenswrapper[4776]: I1204 10:09:43.470085 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd69365-f194-4607-ba00-f17ed2acbdb9" path="/var/lib/kubelet/pods/dcd69365-f194-4607-ba00-f17ed2acbdb9/volumes" Dec 04 10:09:45 crc kubenswrapper[4776]: I1204 10:09:45.459197 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:09:45 crc kubenswrapper[4776]: E1204 10:09:45.459748 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:09:48 crc kubenswrapper[4776]: I1204 10:09:48.035898 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-p2lqk"] Dec 04 10:09:48 crc kubenswrapper[4776]: I1204 10:09:48.045073 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-6vlkr"] Dec 04 10:09:48 crc kubenswrapper[4776]: I1204 10:09:48.052869 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-sync-p2lqk"] Dec 04 10:09:48 crc kubenswrapper[4776]: I1204 10:09:48.062800 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-6vlkr"] Dec 04 10:09:49 crc kubenswrapper[4776]: I1204 10:09:49.463793 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c93c4643-8ac8-4063-b35f-b6de695f42dd" path="/var/lib/kubelet/pods/c93c4643-8ac8-4063-b35f-b6de695f42dd/volumes" Dec 04 10:09:49 crc kubenswrapper[4776]: I1204 10:09:49.465131 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb9bffe6-0182-47a3-b5f6-86297c6f5c92" path="/var/lib/kubelet/pods/eb9bffe6-0182-47a3-b5f6-86297c6f5c92/volumes" Dec 04 10:09:56 crc kubenswrapper[4776]: I1204 10:09:56.452558 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:09:56 crc kubenswrapper[4776]: E1204 10:09:56.453468 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:10:06 crc kubenswrapper[4776]: I1204 10:10:06.642546 4776 generic.go:334] "Generic (PLEG): container finished" podID="8d0203c8-396c-4504-904c-18a82d237314" containerID="f73b9777636353f09e541a566d467454158aa7ee2bc3a628d9384a5742545ba5" exitCode=0 Dec 04 10:10:06 crc kubenswrapper[4776]: I1204 10:10:06.643089 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" event={"ID":"8d0203c8-396c-4504-904c-18a82d237314","Type":"ContainerDied","Data":"f73b9777636353f09e541a566d467454158aa7ee2bc3a628d9384a5742545ba5"} Dec 04 10:10:07 crc 
kubenswrapper[4776]: I1204 10:10:07.452378 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:10:07 crc kubenswrapper[4776]: E1204 10:10:07.452633 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.038137 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.040289 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gn5j\" (UniqueName: \"kubernetes.io/projected/8d0203c8-396c-4504-904c-18a82d237314-kube-api-access-6gn5j\") pod \"8d0203c8-396c-4504-904c-18a82d237314\" (UID: \"8d0203c8-396c-4504-904c-18a82d237314\") " Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.040358 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d0203c8-396c-4504-904c-18a82d237314-inventory\") pod \"8d0203c8-396c-4504-904c-18a82d237314\" (UID: \"8d0203c8-396c-4504-904c-18a82d237314\") " Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.040391 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d0203c8-396c-4504-904c-18a82d237314-ssh-key\") pod \"8d0203c8-396c-4504-904c-18a82d237314\" (UID: \"8d0203c8-396c-4504-904c-18a82d237314\") " Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.046312 4776 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d0203c8-396c-4504-904c-18a82d237314-kube-api-access-6gn5j" (OuterVolumeSpecName: "kube-api-access-6gn5j") pod "8d0203c8-396c-4504-904c-18a82d237314" (UID: "8d0203c8-396c-4504-904c-18a82d237314"). InnerVolumeSpecName "kube-api-access-6gn5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.075589 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0203c8-396c-4504-904c-18a82d237314-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8d0203c8-396c-4504-904c-18a82d237314" (UID: "8d0203c8-396c-4504-904c-18a82d237314"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.079295 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0203c8-396c-4504-904c-18a82d237314-inventory" (OuterVolumeSpecName: "inventory") pod "8d0203c8-396c-4504-904c-18a82d237314" (UID: "8d0203c8-396c-4504-904c-18a82d237314"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.142191 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gn5j\" (UniqueName: \"kubernetes.io/projected/8d0203c8-396c-4504-904c-18a82d237314-kube-api-access-6gn5j\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.142245 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d0203c8-396c-4504-904c-18a82d237314-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.142261 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d0203c8-396c-4504-904c-18a82d237314-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.659939 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" event={"ID":"8d0203c8-396c-4504-904c-18a82d237314","Type":"ContainerDied","Data":"6eafc35716997a797a47cdd1ed03fecd0b343a9aef25951100d7a240459da44c"} Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.660246 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eafc35716997a797a47cdd1ed03fecd0b343a9aef25951100d7a240459da44c" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.659983 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.738267 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw"] Dec 04 10:10:08 crc kubenswrapper[4776]: E1204 10:10:08.738678 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d0203c8-396c-4504-904c-18a82d237314" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.738696 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0203c8-396c-4504-904c-18a82d237314" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.738868 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d0203c8-396c-4504-904c-18a82d237314" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.739544 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.741666 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.742113 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.742322 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.742594 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.748727 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw"] Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.855020 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6tnh\" (UniqueName: \"kubernetes.io/projected/44079ddd-2a33-421b-a2ba-359a08689df6-kube-api-access-g6tnh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw\" (UID: \"44079ddd-2a33-421b-a2ba-359a08689df6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.855129 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44079ddd-2a33-421b-a2ba-359a08689df6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw\" (UID: \"44079ddd-2a33-421b-a2ba-359a08689df6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 
10:10:08.855866 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44079ddd-2a33-421b-a2ba-359a08689df6-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw\" (UID: \"44079ddd-2a33-421b-a2ba-359a08689df6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.958174 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44079ddd-2a33-421b-a2ba-359a08689df6-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw\" (UID: \"44079ddd-2a33-421b-a2ba-359a08689df6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.958286 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6tnh\" (UniqueName: \"kubernetes.io/projected/44079ddd-2a33-421b-a2ba-359a08689df6-kube-api-access-g6tnh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw\" (UID: \"44079ddd-2a33-421b-a2ba-359a08689df6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.958398 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44079ddd-2a33-421b-a2ba-359a08689df6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw\" (UID: \"44079ddd-2a33-421b-a2ba-359a08689df6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.963847 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44079ddd-2a33-421b-a2ba-359a08689df6-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw\" (UID: \"44079ddd-2a33-421b-a2ba-359a08689df6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.970609 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44079ddd-2a33-421b-a2ba-359a08689df6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw\" (UID: \"44079ddd-2a33-421b-a2ba-359a08689df6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" Dec 04 10:10:08 crc kubenswrapper[4776]: I1204 10:10:08.980072 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6tnh\" (UniqueName: \"kubernetes.io/projected/44079ddd-2a33-421b-a2ba-359a08689df6-kube-api-access-g6tnh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw\" (UID: \"44079ddd-2a33-421b-a2ba-359a08689df6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" Dec 04 10:10:09 crc kubenswrapper[4776]: I1204 10:10:09.056240 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" Dec 04 10:10:09 crc kubenswrapper[4776]: I1204 10:10:09.610114 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw"] Dec 04 10:10:09 crc kubenswrapper[4776]: I1204 10:10:09.670331 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" event={"ID":"44079ddd-2a33-421b-a2ba-359a08689df6","Type":"ContainerStarted","Data":"a58b8f755eb776b84f29f8eea5d46c492b1a733062efa298c97fe88fee076181"} Dec 04 10:10:10 crc kubenswrapper[4776]: I1204 10:10:10.681376 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" event={"ID":"44079ddd-2a33-421b-a2ba-359a08689df6","Type":"ContainerStarted","Data":"c3e3f214157fa63d0bc680ca0d118b2f914cb195604dce268b25f913f5b55f83"} Dec 04 10:10:10 crc kubenswrapper[4776]: I1204 10:10:10.711177 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" podStartSLOduration=1.996725409 podStartE2EDuration="2.711152892s" podCreationTimestamp="2025-12-04 10:10:08 +0000 UTC" firstStartedPulling="2025-12-04 10:10:09.613035293 +0000 UTC m=+1854.479515670" lastFinishedPulling="2025-12-04 10:10:10.327462776 +0000 UTC m=+1855.193943153" observedRunningTime="2025-12-04 10:10:10.703943837 +0000 UTC m=+1855.570424214" watchObservedRunningTime="2025-12-04 10:10:10.711152892 +0000 UTC m=+1855.577633269" Dec 04 10:10:15 crc kubenswrapper[4776]: I1204 10:10:15.723426 4776 generic.go:334] "Generic (PLEG): container finished" podID="44079ddd-2a33-421b-a2ba-359a08689df6" containerID="c3e3f214157fa63d0bc680ca0d118b2f914cb195604dce268b25f913f5b55f83" exitCode=0 Dec 04 10:10:15 crc kubenswrapper[4776]: I1204 10:10:15.723992 4776 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" event={"ID":"44079ddd-2a33-421b-a2ba-359a08689df6","Type":"ContainerDied","Data":"c3e3f214157fa63d0bc680ca0d118b2f914cb195604dce268b25f913f5b55f83"} Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.120200 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.224905 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6tnh\" (UniqueName: \"kubernetes.io/projected/44079ddd-2a33-421b-a2ba-359a08689df6-kube-api-access-g6tnh\") pod \"44079ddd-2a33-421b-a2ba-359a08689df6\" (UID: \"44079ddd-2a33-421b-a2ba-359a08689df6\") " Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.225054 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44079ddd-2a33-421b-a2ba-359a08689df6-inventory\") pod \"44079ddd-2a33-421b-a2ba-359a08689df6\" (UID: \"44079ddd-2a33-421b-a2ba-359a08689df6\") " Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.225174 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44079ddd-2a33-421b-a2ba-359a08689df6-ssh-key\") pod \"44079ddd-2a33-421b-a2ba-359a08689df6\" (UID: \"44079ddd-2a33-421b-a2ba-359a08689df6\") " Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.230661 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44079ddd-2a33-421b-a2ba-359a08689df6-kube-api-access-g6tnh" (OuterVolumeSpecName: "kube-api-access-g6tnh") pod "44079ddd-2a33-421b-a2ba-359a08689df6" (UID: "44079ddd-2a33-421b-a2ba-359a08689df6"). InnerVolumeSpecName "kube-api-access-g6tnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.259505 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44079ddd-2a33-421b-a2ba-359a08689df6-inventory" (OuterVolumeSpecName: "inventory") pod "44079ddd-2a33-421b-a2ba-359a08689df6" (UID: "44079ddd-2a33-421b-a2ba-359a08689df6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.260258 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44079ddd-2a33-421b-a2ba-359a08689df6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "44079ddd-2a33-421b-a2ba-359a08689df6" (UID: "44079ddd-2a33-421b-a2ba-359a08689df6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.327531 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6tnh\" (UniqueName: \"kubernetes.io/projected/44079ddd-2a33-421b-a2ba-359a08689df6-kube-api-access-g6tnh\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.327583 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44079ddd-2a33-421b-a2ba-359a08689df6-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.327598 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44079ddd-2a33-421b-a2ba-359a08689df6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.740432 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" 
event={"ID":"44079ddd-2a33-421b-a2ba-359a08689df6","Type":"ContainerDied","Data":"a58b8f755eb776b84f29f8eea5d46c492b1a733062efa298c97fe88fee076181"} Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.740482 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a58b8f755eb776b84f29f8eea5d46c492b1a733062efa298c97fe88fee076181" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.740540 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.808844 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq"] Dec 04 10:10:17 crc kubenswrapper[4776]: E1204 10:10:17.809258 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44079ddd-2a33-421b-a2ba-359a08689df6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.809276 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="44079ddd-2a33-421b-a2ba-359a08689df6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.809486 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="44079ddd-2a33-421b-a2ba-359a08689df6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.810267 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.812265 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.812460 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.814690 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.814755 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.837105 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29858f05-6bc6-4f33-ae7e-e65c4737eed7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c55fq\" (UID: \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.837392 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l8sg\" (UniqueName: \"kubernetes.io/projected/29858f05-6bc6-4f33-ae7e-e65c4737eed7-kube-api-access-5l8sg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c55fq\" (UID: \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.837581 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29858f05-6bc6-4f33-ae7e-e65c4737eed7-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-c55fq\" (UID: \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.842985 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq"] Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.939321 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29858f05-6bc6-4f33-ae7e-e65c4737eed7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c55fq\" (UID: \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.939457 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l8sg\" (UniqueName: \"kubernetes.io/projected/29858f05-6bc6-4f33-ae7e-e65c4737eed7-kube-api-access-5l8sg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c55fq\" (UID: \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.939549 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29858f05-6bc6-4f33-ae7e-e65c4737eed7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c55fq\" (UID: \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.944055 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29858f05-6bc6-4f33-ae7e-e65c4737eed7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c55fq\" (UID: 
\"29858f05-6bc6-4f33-ae7e-e65c4737eed7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.944343 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29858f05-6bc6-4f33-ae7e-e65c4737eed7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c55fq\" (UID: \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" Dec 04 10:10:17 crc kubenswrapper[4776]: I1204 10:10:17.964515 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l8sg\" (UniqueName: \"kubernetes.io/projected/29858f05-6bc6-4f33-ae7e-e65c4737eed7-kube-api-access-5l8sg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c55fq\" (UID: \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" Dec 04 10:10:18 crc kubenswrapper[4776]: I1204 10:10:18.128075 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" Dec 04 10:10:18 crc kubenswrapper[4776]: I1204 10:10:18.632695 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq"] Dec 04 10:10:18 crc kubenswrapper[4776]: I1204 10:10:18.750638 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" event={"ID":"29858f05-6bc6-4f33-ae7e-e65c4737eed7","Type":"ContainerStarted","Data":"0d9e4f1eda6ed7575c6cd317ea3a5ce71466555b87a975e14692f7e70b6fdf79"} Dec 04 10:10:19 crc kubenswrapper[4776]: I1204 10:10:19.453304 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:10:19 crc kubenswrapper[4776]: I1204 10:10:19.762195 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"2ef57bb754c7648e58638acbf6214793456368538049ee0eeb6ee6e04ffa60f3"} Dec 04 10:10:19 crc kubenswrapper[4776]: I1204 10:10:19.766813 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" event={"ID":"29858f05-6bc6-4f33-ae7e-e65c4737eed7","Type":"ContainerStarted","Data":"cb95d2daacf74d0d9dd89b10049914f35825e3efe76a2ccfe35320a9b10f9162"} Dec 04 10:10:19 crc kubenswrapper[4776]: I1204 10:10:19.821316 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" podStartSLOduration=2.223860246 podStartE2EDuration="2.821291531s" podCreationTimestamp="2025-12-04 10:10:17 +0000 UTC" firstStartedPulling="2025-12-04 10:10:18.636012279 +0000 UTC m=+1863.502492656" lastFinishedPulling="2025-12-04 10:10:19.233443564 +0000 UTC m=+1864.099923941" observedRunningTime="2025-12-04 
10:10:19.799543799 +0000 UTC m=+1864.666024176" watchObservedRunningTime="2025-12-04 10:10:19.821291531 +0000 UTC m=+1864.687771928" Dec 04 10:10:20 crc kubenswrapper[4776]: I1204 10:10:20.369252 4776 scope.go:117] "RemoveContainer" containerID="215eb17baeaabf160ea79f4430a65564a741d6d7683d8872a40f6dc4a2e91e9e" Dec 04 10:10:20 crc kubenswrapper[4776]: I1204 10:10:20.392629 4776 scope.go:117] "RemoveContainer" containerID="21cf6fe4b4c030a58288e7e8cb7ce5989b2ae45d534dc9813783295da995e03f" Dec 04 10:10:20 crc kubenswrapper[4776]: I1204 10:10:20.440430 4776 scope.go:117] "RemoveContainer" containerID="b8dc6e63c74e7238d0747cea357433abfe7422b4fe9b3446095739ad40e999e5" Dec 04 10:10:20 crc kubenswrapper[4776]: I1204 10:10:20.484829 4776 scope.go:117] "RemoveContainer" containerID="5729bc78d7877fa1b23598e87af657f7d2b962fb77203d5b729c4544718c5afb" Dec 04 10:10:20 crc kubenswrapper[4776]: I1204 10:10:20.533744 4776 scope.go:117] "RemoveContainer" containerID="d5e8e6fe56d7b7af2e687137706b3465707aad1a1e0d5a4f26267fca96eb8a92" Dec 04 10:10:20 crc kubenswrapper[4776]: I1204 10:10:20.559016 4776 scope.go:117] "RemoveContainer" containerID="51875faf3a5817f0207df4d4130509a0a1d19bdcedf23ab5db21c33708fedfe1" Dec 04 10:10:20 crc kubenswrapper[4776]: I1204 10:10:20.600718 4776 scope.go:117] "RemoveContainer" containerID="bf14193fa054783ce050dbba93b5a5a233448023726cfeecd16c960f32165542" Dec 04 10:10:20 crc kubenswrapper[4776]: I1204 10:10:20.636313 4776 scope.go:117] "RemoveContainer" containerID="13bf0549a19da56078d92bf35ceef7c3032c03631eb85ca68ff7670f3afcd526" Dec 04 10:10:24 crc kubenswrapper[4776]: I1204 10:10:24.043908 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-m2qwc"] Dec 04 10:10:24 crc kubenswrapper[4776]: I1204 10:10:24.053244 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-m2qwc"] Dec 04 10:10:25 crc kubenswrapper[4776]: I1204 10:10:25.466095 4776 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="67ed9105-24d2-4ffd-9ee6-7dac342541de" path="/var/lib/kubelet/pods/67ed9105-24d2-4ffd-9ee6-7dac342541de/volumes" Dec 04 10:10:33 crc kubenswrapper[4776]: I1204 10:10:33.036857 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-76hnz"] Dec 04 10:10:33 crc kubenswrapper[4776]: I1204 10:10:33.045494 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-76hnz"] Dec 04 10:10:33 crc kubenswrapper[4776]: I1204 10:10:33.464428 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c107c323-1d20-4a32-82be-2085097e6d5d" path="/var/lib/kubelet/pods/c107c323-1d20-4a32-82be-2085097e6d5d/volumes" Dec 04 10:10:34 crc kubenswrapper[4776]: I1204 10:10:34.036604 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-z2nzw"] Dec 04 10:10:34 crc kubenswrapper[4776]: I1204 10:10:34.047280 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-z2nzw"] Dec 04 10:10:35 crc kubenswrapper[4776]: I1204 10:10:35.464482 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1453cb-374e-4b8d-8f13-af4be7baa997" path="/var/lib/kubelet/pods/fb1453cb-374e-4b8d-8f13-af4be7baa997/volumes" Dec 04 10:10:40 crc kubenswrapper[4776]: I1204 10:10:40.030293 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wqhr2"] Dec 04 10:10:40 crc kubenswrapper[4776]: I1204 10:10:40.041760 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wqhr2"] Dec 04 10:10:41 crc kubenswrapper[4776]: I1204 10:10:41.030748 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-4bt5q"] Dec 04 10:10:41 crc kubenswrapper[4776]: I1204 10:10:41.039545 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-4bt5q"] Dec 04 10:10:41 crc kubenswrapper[4776]: I1204 10:10:41.462686 4776 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="8765b919-d724-4148-8ba8-a550cd8029fc" path="/var/lib/kubelet/pods/8765b919-d724-4148-8ba8-a550cd8029fc/volumes" Dec 04 10:10:41 crc kubenswrapper[4776]: I1204 10:10:41.463704 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e92fb916-9d62-42d7-bad8-12cd43af37e9" path="/var/lib/kubelet/pods/e92fb916-9d62-42d7-bad8-12cd43af37e9/volumes" Dec 04 10:10:58 crc kubenswrapper[4776]: I1204 10:10:58.108311 4776 generic.go:334] "Generic (PLEG): container finished" podID="29858f05-6bc6-4f33-ae7e-e65c4737eed7" containerID="cb95d2daacf74d0d9dd89b10049914f35825e3efe76a2ccfe35320a9b10f9162" exitCode=0 Dec 04 10:10:58 crc kubenswrapper[4776]: I1204 10:10:58.108418 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" event={"ID":"29858f05-6bc6-4f33-ae7e-e65c4737eed7","Type":"ContainerDied","Data":"cb95d2daacf74d0d9dd89b10049914f35825e3efe76a2ccfe35320a9b10f9162"} Dec 04 10:10:59 crc kubenswrapper[4776]: I1204 10:10:59.575064 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" Dec 04 10:10:59 crc kubenswrapper[4776]: I1204 10:10:59.708500 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l8sg\" (UniqueName: \"kubernetes.io/projected/29858f05-6bc6-4f33-ae7e-e65c4737eed7-kube-api-access-5l8sg\") pod \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\" (UID: \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\") " Dec 04 10:10:59 crc kubenswrapper[4776]: I1204 10:10:59.708587 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29858f05-6bc6-4f33-ae7e-e65c4737eed7-inventory\") pod \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\" (UID: \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\") " Dec 04 10:10:59 crc kubenswrapper[4776]: I1204 10:10:59.708686 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29858f05-6bc6-4f33-ae7e-e65c4737eed7-ssh-key\") pod \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\" (UID: \"29858f05-6bc6-4f33-ae7e-e65c4737eed7\") " Dec 04 10:10:59 crc kubenswrapper[4776]: I1204 10:10:59.715414 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29858f05-6bc6-4f33-ae7e-e65c4737eed7-kube-api-access-5l8sg" (OuterVolumeSpecName: "kube-api-access-5l8sg") pod "29858f05-6bc6-4f33-ae7e-e65c4737eed7" (UID: "29858f05-6bc6-4f33-ae7e-e65c4737eed7"). InnerVolumeSpecName "kube-api-access-5l8sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:10:59 crc kubenswrapper[4776]: I1204 10:10:59.741151 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29858f05-6bc6-4f33-ae7e-e65c4737eed7-inventory" (OuterVolumeSpecName: "inventory") pod "29858f05-6bc6-4f33-ae7e-e65c4737eed7" (UID: "29858f05-6bc6-4f33-ae7e-e65c4737eed7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:10:59 crc kubenswrapper[4776]: I1204 10:10:59.742855 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29858f05-6bc6-4f33-ae7e-e65c4737eed7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29858f05-6bc6-4f33-ae7e-e65c4737eed7" (UID: "29858f05-6bc6-4f33-ae7e-e65c4737eed7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:10:59 crc kubenswrapper[4776]: I1204 10:10:59.815020 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29858f05-6bc6-4f33-ae7e-e65c4737eed7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:59 crc kubenswrapper[4776]: I1204 10:10:59.815061 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l8sg\" (UniqueName: \"kubernetes.io/projected/29858f05-6bc6-4f33-ae7e-e65c4737eed7-kube-api-access-5l8sg\") on node \"crc\" DevicePath \"\"" Dec 04 10:10:59 crc kubenswrapper[4776]: I1204 10:10:59.815089 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29858f05-6bc6-4f33-ae7e-e65c4737eed7-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.128783 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" event={"ID":"29858f05-6bc6-4f33-ae7e-e65c4737eed7","Type":"ContainerDied","Data":"0d9e4f1eda6ed7575c6cd317ea3a5ce71466555b87a975e14692f7e70b6fdf79"} Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.129019 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d9e4f1eda6ed7575c6cd317ea3a5ce71466555b87a975e14692f7e70b6fdf79" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.129086 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.220883 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6"] Dec 04 10:11:00 crc kubenswrapper[4776]: E1204 10:11:00.221808 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29858f05-6bc6-4f33-ae7e-e65c4737eed7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.221831 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="29858f05-6bc6-4f33-ae7e-e65c4737eed7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.222065 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="29858f05-6bc6-4f33-ae7e-e65c4737eed7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.222775 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.226126 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.226456 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.226467 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.226712 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.242943 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6"] Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.325894 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt9z7\" (UniqueName: \"kubernetes.io/projected/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-kube-api-access-xt9z7\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6\" (UID: \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.326051 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6\" (UID: \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.326152 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6\" (UID: \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.428092 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt9z7\" (UniqueName: \"kubernetes.io/projected/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-kube-api-access-xt9z7\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6\" (UID: \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.428218 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6\" (UID: \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.428278 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6\" (UID: \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.435007 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6\" (UID: 
\"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.436768 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6\" (UID: \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.451019 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt9z7\" (UniqueName: \"kubernetes.io/projected/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-kube-api-access-xt9z7\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6\" (UID: \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" Dec 04 10:11:00 crc kubenswrapper[4776]: I1204 10:11:00.556797 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" Dec 04 10:11:01 crc kubenswrapper[4776]: I1204 10:11:01.065316 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6"] Dec 04 10:11:01 crc kubenswrapper[4776]: I1204 10:11:01.139986 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" event={"ID":"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8","Type":"ContainerStarted","Data":"025e918840c79eaae9cb1a8d0ea1957dc706baf86ee755c16fd9fe385dfd2f0c"} Dec 04 10:11:02 crc kubenswrapper[4776]: I1204 10:11:02.149549 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" event={"ID":"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8","Type":"ContainerStarted","Data":"5fae47c28074ecddc2734307cf27e846ef61d3d1c86eb77b19d32ac4a5c9c935"} Dec 04 10:11:02 crc kubenswrapper[4776]: I1204 10:11:02.173595 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" podStartSLOduration=1.726516835 podStartE2EDuration="2.173577954s" podCreationTimestamp="2025-12-04 10:11:00 +0000 UTC" firstStartedPulling="2025-12-04 10:11:01.072929651 +0000 UTC m=+1905.939410028" lastFinishedPulling="2025-12-04 10:11:01.51999077 +0000 UTC m=+1906.386471147" observedRunningTime="2025-12-04 10:11:02.168404802 +0000 UTC m=+1907.034885189" watchObservedRunningTime="2025-12-04 10:11:02.173577954 +0000 UTC m=+1907.040058321" Dec 04 10:11:06 crc kubenswrapper[4776]: I1204 10:11:06.188004 4776 generic.go:334] "Generic (PLEG): container finished" podID="f802e144-1ad6-4d7c-a0fb-5935d8ae53d8" containerID="5fae47c28074ecddc2734307cf27e846ef61d3d1c86eb77b19d32ac4a5c9c935" exitCode=0 Dec 04 10:11:06 crc kubenswrapper[4776]: I1204 10:11:06.188115 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" event={"ID":"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8","Type":"ContainerDied","Data":"5fae47c28074ecddc2734307cf27e846ef61d3d1c86eb77b19d32ac4a5c9c935"} Dec 04 10:11:07 crc kubenswrapper[4776]: I1204 10:11:07.554217 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" Dec 04 10:11:07 crc kubenswrapper[4776]: I1204 10:11:07.666130 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-ssh-key\") pod \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\" (UID: \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\") " Dec 04 10:11:07 crc kubenswrapper[4776]: I1204 10:11:07.666257 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt9z7\" (UniqueName: \"kubernetes.io/projected/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-kube-api-access-xt9z7\") pod \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\" (UID: \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\") " Dec 04 10:11:07 crc kubenswrapper[4776]: I1204 10:11:07.666329 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-inventory\") pod \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\" (UID: \"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8\") " Dec 04 10:11:07 crc kubenswrapper[4776]: I1204 10:11:07.681206 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-kube-api-access-xt9z7" (OuterVolumeSpecName: "kube-api-access-xt9z7") pod "f802e144-1ad6-4d7c-a0fb-5935d8ae53d8" (UID: "f802e144-1ad6-4d7c-a0fb-5935d8ae53d8"). InnerVolumeSpecName "kube-api-access-xt9z7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:11:07 crc kubenswrapper[4776]: I1204 10:11:07.694959 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-inventory" (OuterVolumeSpecName: "inventory") pod "f802e144-1ad6-4d7c-a0fb-5935d8ae53d8" (UID: "f802e144-1ad6-4d7c-a0fb-5935d8ae53d8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:11:07 crc kubenswrapper[4776]: I1204 10:11:07.702426 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f802e144-1ad6-4d7c-a0fb-5935d8ae53d8" (UID: "f802e144-1ad6-4d7c-a0fb-5935d8ae53d8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:11:07 crc kubenswrapper[4776]: I1204 10:11:07.768373 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:11:07 crc kubenswrapper[4776]: I1204 10:11:07.768413 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt9z7\" (UniqueName: \"kubernetes.io/projected/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-kube-api-access-xt9z7\") on node \"crc\" DevicePath \"\"" Dec 04 10:11:07 crc kubenswrapper[4776]: I1204 10:11:07.768425 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.209838 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" 
event={"ID":"f802e144-1ad6-4d7c-a0fb-5935d8ae53d8","Type":"ContainerDied","Data":"025e918840c79eaae9cb1a8d0ea1957dc706baf86ee755c16fd9fe385dfd2f0c"} Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.209881 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="025e918840c79eaae9cb1a8d0ea1957dc706baf86ee755c16fd9fe385dfd2f0c" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.209930 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.308495 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7"] Dec 04 10:11:08 crc kubenswrapper[4776]: E1204 10:11:08.310461 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f802e144-1ad6-4d7c-a0fb-5935d8ae53d8" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.310515 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="f802e144-1ad6-4d7c-a0fb-5935d8ae53d8" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.310866 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="f802e144-1ad6-4d7c-a0fb-5935d8ae53d8" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.311873 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.315399 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.315885 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.315901 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.318776 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.351686 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7"] Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.385354 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f1bd233-1d50-4916-8735-65b29b8dac15-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5flb7\" (UID: \"2f1bd233-1d50-4916-8735-65b29b8dac15\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.385815 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f1bd233-1d50-4916-8735-65b29b8dac15-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5flb7\" (UID: \"2f1bd233-1d50-4916-8735-65b29b8dac15\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.386318 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2lw7\" (UniqueName: \"kubernetes.io/projected/2f1bd233-1d50-4916-8735-65b29b8dac15-kube-api-access-c2lw7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5flb7\" (UID: \"2f1bd233-1d50-4916-8735-65b29b8dac15\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.489159 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f1bd233-1d50-4916-8735-65b29b8dac15-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5flb7\" (UID: \"2f1bd233-1d50-4916-8735-65b29b8dac15\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.489339 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2lw7\" (UniqueName: \"kubernetes.io/projected/2f1bd233-1d50-4916-8735-65b29b8dac15-kube-api-access-c2lw7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5flb7\" (UID: \"2f1bd233-1d50-4916-8735-65b29b8dac15\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.489417 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f1bd233-1d50-4916-8735-65b29b8dac15-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5flb7\" (UID: \"2f1bd233-1d50-4916-8735-65b29b8dac15\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.494092 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f1bd233-1d50-4916-8735-65b29b8dac15-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5flb7\" (UID: 
\"2f1bd233-1d50-4916-8735-65b29b8dac15\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.497608 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f1bd233-1d50-4916-8735-65b29b8dac15-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5flb7\" (UID: \"2f1bd233-1d50-4916-8735-65b29b8dac15\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.509885 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2lw7\" (UniqueName: \"kubernetes.io/projected/2f1bd233-1d50-4916-8735-65b29b8dac15-kube-api-access-c2lw7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5flb7\" (UID: \"2f1bd233-1d50-4916-8735-65b29b8dac15\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" Dec 04 10:11:08 crc kubenswrapper[4776]: I1204 10:11:08.639251 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" Dec 04 10:11:09 crc kubenswrapper[4776]: I1204 10:11:09.169670 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7"] Dec 04 10:11:09 crc kubenswrapper[4776]: I1204 10:11:09.219760 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" event={"ID":"2f1bd233-1d50-4916-8735-65b29b8dac15","Type":"ContainerStarted","Data":"efc3bb8a2a228de4bb89ebaa6a3e96b6b1cca5f9a88431b4234fa02ca5735e1f"} Dec 04 10:11:10 crc kubenswrapper[4776]: I1204 10:11:10.234835 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" event={"ID":"2f1bd233-1d50-4916-8735-65b29b8dac15","Type":"ContainerStarted","Data":"b26a21f0298a452e5e8f5c89ebc31492bf5c8bc6778fc335da75bbe49d982bf7"} Dec 04 10:11:10 crc kubenswrapper[4776]: I1204 10:11:10.266029 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" podStartSLOduration=1.821060339 podStartE2EDuration="2.266010242s" podCreationTimestamp="2025-12-04 10:11:08 +0000 UTC" firstStartedPulling="2025-12-04 10:11:09.18135929 +0000 UTC m=+1914.047839667" lastFinishedPulling="2025-12-04 10:11:09.626309193 +0000 UTC m=+1914.492789570" observedRunningTime="2025-12-04 10:11:10.25955726 +0000 UTC m=+1915.126037637" watchObservedRunningTime="2025-12-04 10:11:10.266010242 +0000 UTC m=+1915.132490619" Dec 04 10:11:20 crc kubenswrapper[4776]: I1204 10:11:20.768984 4776 scope.go:117] "RemoveContainer" containerID="9354f045692036745cc5485d76fe338549e9d13fddee8f13e3c03437b9c26cb5" Dec 04 10:11:20 crc kubenswrapper[4776]: I1204 10:11:20.799977 4776 scope.go:117] "RemoveContainer" containerID="dd61dde0dbd678bae605906865df598586c36f51ad77fc54bba91334151612b1" Dec 04 10:11:20 crc 
kubenswrapper[4776]: I1204 10:11:20.852363 4776 scope.go:117] "RemoveContainer" containerID="e916ce4c0ea710c0a4b38826f12fd7b949e75724ba7e9a41cc90391ee9e3198e" Dec 04 10:11:20 crc kubenswrapper[4776]: I1204 10:11:20.918473 4776 scope.go:117] "RemoveContainer" containerID="add6ce5137423dbc4ec61824b592be617854d5f0fc003d8126228cc875b1adec" Dec 04 10:11:20 crc kubenswrapper[4776]: I1204 10:11:20.964081 4776 scope.go:117] "RemoveContainer" containerID="1d35f9783f8cf066b45481ccf0829d2b056dc3a577d52a6bf4b0a45768cc5c20" Dec 04 10:11:30 crc kubenswrapper[4776]: I1204 10:11:30.039966 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7211-account-create-update-v84w9"] Dec 04 10:11:30 crc kubenswrapper[4776]: I1204 10:11:30.077249 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-k7pq8"] Dec 04 10:11:30 crc kubenswrapper[4776]: I1204 10:11:30.087085 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-n6cjq"] Dec 04 10:11:30 crc kubenswrapper[4776]: I1204 10:11:30.094882 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7211-account-create-update-v84w9"] Dec 04 10:11:30 crc kubenswrapper[4776]: I1204 10:11:30.102753 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-k7pq8"] Dec 04 10:11:30 crc kubenswrapper[4776]: I1204 10:11:30.110093 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-n6cjq"] Dec 04 10:11:31 crc kubenswrapper[4776]: I1204 10:11:31.033716 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-cda9-account-create-update-mhwlb"] Dec 04 10:11:31 crc kubenswrapper[4776]: I1204 10:11:31.044385 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-858g8"] Dec 04 10:11:31 crc kubenswrapper[4776]: I1204 10:11:31.051309 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-7d33-account-create-update-j4l8p"] Dec 04 10:11:31 crc kubenswrapper[4776]: I1204 10:11:31.058788 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-cda9-account-create-update-mhwlb"] Dec 04 10:11:31 crc kubenswrapper[4776]: I1204 10:11:31.066509 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7d33-account-create-update-j4l8p"] Dec 04 10:11:31 crc kubenswrapper[4776]: I1204 10:11:31.072447 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-858g8"] Dec 04 10:11:31 crc kubenswrapper[4776]: I1204 10:11:31.465130 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1556365b-47ac-4dc6-995c-60236a99c4cc" path="/var/lib/kubelet/pods/1556365b-47ac-4dc6-995c-60236a99c4cc/volumes" Dec 04 10:11:31 crc kubenswrapper[4776]: I1204 10:11:31.466136 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41998dcd-34d5-4335-a5fa-8e6ceb8aac4c" path="/var/lib/kubelet/pods/41998dcd-34d5-4335-a5fa-8e6ceb8aac4c/volumes" Dec 04 10:11:31 crc kubenswrapper[4776]: I1204 10:11:31.466755 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="645e3327-5337-40f6-b730-817d497cf5b8" path="/var/lib/kubelet/pods/645e3327-5337-40f6-b730-817d497cf5b8/volumes" Dec 04 10:11:31 crc kubenswrapper[4776]: I1204 10:11:31.467341 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="732e1a2d-02d2-4754-9489-6bd42ba248e8" path="/var/lib/kubelet/pods/732e1a2d-02d2-4754-9489-6bd42ba248e8/volumes" Dec 04 10:11:31 crc kubenswrapper[4776]: I1204 10:11:31.468380 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="747fc25a-59ce-428f-8459-6180355f4629" path="/var/lib/kubelet/pods/747fc25a-59ce-428f-8459-6180355f4629/volumes" Dec 04 10:11:31 crc kubenswrapper[4776]: I1204 10:11:31.468903 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d57e217c-6f7f-4ccc-9083-db0620a54c8d" path="/var/lib/kubelet/pods/d57e217c-6f7f-4ccc-9083-db0620a54c8d/volumes" Dec 04 10:12:02 crc kubenswrapper[4776]: I1204 10:12:02.925036 4776 generic.go:334] "Generic (PLEG): container finished" podID="2f1bd233-1d50-4916-8735-65b29b8dac15" containerID="b26a21f0298a452e5e8f5c89ebc31492bf5c8bc6778fc335da75bbe49d982bf7" exitCode=0 Dec 04 10:12:02 crc kubenswrapper[4776]: I1204 10:12:02.925581 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" event={"ID":"2f1bd233-1d50-4916-8735-65b29b8dac15","Type":"ContainerDied","Data":"b26a21f0298a452e5e8f5c89ebc31492bf5c8bc6778fc335da75bbe49d982bf7"} Dec 04 10:12:03 crc kubenswrapper[4776]: I1204 10:12:03.073043 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8jnr4"] Dec 04 10:12:03 crc kubenswrapper[4776]: I1204 10:12:03.085844 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8jnr4"] Dec 04 10:12:03 crc kubenswrapper[4776]: I1204 10:12:03.464446 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ed96537-e4c0-433d-8b37-bf0e2c673816" path="/var/lib/kubelet/pods/8ed96537-e4c0-433d-8b37-bf0e2c673816/volumes" Dec 04 10:12:04 crc kubenswrapper[4776]: I1204 10:12:04.369680 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" Dec 04 10:12:04 crc kubenswrapper[4776]: I1204 10:12:04.473455 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2lw7\" (UniqueName: \"kubernetes.io/projected/2f1bd233-1d50-4916-8735-65b29b8dac15-kube-api-access-c2lw7\") pod \"2f1bd233-1d50-4916-8735-65b29b8dac15\" (UID: \"2f1bd233-1d50-4916-8735-65b29b8dac15\") " Dec 04 10:12:04 crc kubenswrapper[4776]: I1204 10:12:04.473583 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f1bd233-1d50-4916-8735-65b29b8dac15-inventory\") pod \"2f1bd233-1d50-4916-8735-65b29b8dac15\" (UID: \"2f1bd233-1d50-4916-8735-65b29b8dac15\") " Dec 04 10:12:04 crc kubenswrapper[4776]: I1204 10:12:04.473623 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f1bd233-1d50-4916-8735-65b29b8dac15-ssh-key\") pod \"2f1bd233-1d50-4916-8735-65b29b8dac15\" (UID: \"2f1bd233-1d50-4916-8735-65b29b8dac15\") " Dec 04 10:12:04 crc kubenswrapper[4776]: I1204 10:12:04.482512 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1bd233-1d50-4916-8735-65b29b8dac15-kube-api-access-c2lw7" (OuterVolumeSpecName: "kube-api-access-c2lw7") pod "2f1bd233-1d50-4916-8735-65b29b8dac15" (UID: "2f1bd233-1d50-4916-8735-65b29b8dac15"). InnerVolumeSpecName "kube-api-access-c2lw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:12:04 crc kubenswrapper[4776]: I1204 10:12:04.501898 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1bd233-1d50-4916-8735-65b29b8dac15-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2f1bd233-1d50-4916-8735-65b29b8dac15" (UID: "2f1bd233-1d50-4916-8735-65b29b8dac15"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:12:04 crc kubenswrapper[4776]: I1204 10:12:04.507825 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1bd233-1d50-4916-8735-65b29b8dac15-inventory" (OuterVolumeSpecName: "inventory") pod "2f1bd233-1d50-4916-8735-65b29b8dac15" (UID: "2f1bd233-1d50-4916-8735-65b29b8dac15"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:12:04 crc kubenswrapper[4776]: I1204 10:12:04.575497 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2lw7\" (UniqueName: \"kubernetes.io/projected/2f1bd233-1d50-4916-8735-65b29b8dac15-kube-api-access-c2lw7\") on node \"crc\" DevicePath \"\"" Dec 04 10:12:04 crc kubenswrapper[4776]: I1204 10:12:04.575526 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f1bd233-1d50-4916-8735-65b29b8dac15-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:12:04 crc kubenswrapper[4776]: I1204 10:12:04.575536 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f1bd233-1d50-4916-8735-65b29b8dac15-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:12:04 crc kubenswrapper[4776]: I1204 10:12:04.942095 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" event={"ID":"2f1bd233-1d50-4916-8735-65b29b8dac15","Type":"ContainerDied","Data":"efc3bb8a2a228de4bb89ebaa6a3e96b6b1cca5f9a88431b4234fa02ca5735e1f"} Dec 04 10:12:04 crc kubenswrapper[4776]: I1204 10:12:04.942132 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efc3bb8a2a228de4bb89ebaa6a3e96b6b1cca5f9a88431b4234fa02ca5735e1f" Dec 04 10:12:04 crc kubenswrapper[4776]: I1204 10:12:04.942185 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.065635 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mts5t"] Dec 04 10:12:05 crc kubenswrapper[4776]: E1204 10:12:05.066143 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1bd233-1d50-4916-8735-65b29b8dac15" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.066167 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1bd233-1d50-4916-8735-65b29b8dac15" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.066422 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1bd233-1d50-4916-8735-65b29b8dac15" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.067270 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.069393 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.069553 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.069678 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.069701 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.109728 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mts5t"] Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.189659 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59jgd\" (UniqueName: \"kubernetes.io/projected/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-kube-api-access-59jgd\") pod \"ssh-known-hosts-edpm-deployment-mts5t\" (UID: \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\") " pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.189795 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mts5t\" (UID: \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\") " pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.189838 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mts5t\" (UID: \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\") " pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.291632 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mts5t\" (UID: \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\") " pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.291739 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mts5t\" (UID: \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\") " pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.291855 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59jgd\" (UniqueName: \"kubernetes.io/projected/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-kube-api-access-59jgd\") pod \"ssh-known-hosts-edpm-deployment-mts5t\" (UID: \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\") " pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.299734 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mts5t\" (UID: \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\") " pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.299768 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mts5t\" (UID: \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\") " pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.313320 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59jgd\" (UniqueName: \"kubernetes.io/projected/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-kube-api-access-59jgd\") pod \"ssh-known-hosts-edpm-deployment-mts5t\" (UID: \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\") " pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.384880 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.927880 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mts5t"] Dec 04 10:12:05 crc kubenswrapper[4776]: I1204 10:12:05.960249 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" event={"ID":"4282b94f-74cf-4ce8-9ab1-7235dfa27a56","Type":"ContainerStarted","Data":"d294218371aa5b8619dd6839ec11d83bcafc5e37b0c19d9726dff4daa63e53f0"} Dec 04 10:12:06 crc kubenswrapper[4776]: I1204 10:12:06.969466 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" event={"ID":"4282b94f-74cf-4ce8-9ab1-7235dfa27a56","Type":"ContainerStarted","Data":"0b86543dd5d5cfdb3f17829b330b5f97d6f4a697c5a8f31f24fc69a134c31fb1"} Dec 04 10:12:06 crc kubenswrapper[4776]: I1204 10:12:06.991890 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" 
podStartSLOduration=1.571217777 podStartE2EDuration="1.991874318s" podCreationTimestamp="2025-12-04 10:12:05 +0000 UTC" firstStartedPulling="2025-12-04 10:12:05.943164473 +0000 UTC m=+1970.809644850" lastFinishedPulling="2025-12-04 10:12:06.363821014 +0000 UTC m=+1971.230301391" observedRunningTime="2025-12-04 10:12:06.987405908 +0000 UTC m=+1971.853886305" watchObservedRunningTime="2025-12-04 10:12:06.991874318 +0000 UTC m=+1971.858354685" Dec 04 10:12:14 crc kubenswrapper[4776]: I1204 10:12:14.041524 4776 generic.go:334] "Generic (PLEG): container finished" podID="4282b94f-74cf-4ce8-9ab1-7235dfa27a56" containerID="0b86543dd5d5cfdb3f17829b330b5f97d6f4a697c5a8f31f24fc69a134c31fb1" exitCode=0 Dec 04 10:12:14 crc kubenswrapper[4776]: I1204 10:12:14.041724 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" event={"ID":"4282b94f-74cf-4ce8-9ab1-7235dfa27a56","Type":"ContainerDied","Data":"0b86543dd5d5cfdb3f17829b330b5f97d6f4a697c5a8f31f24fc69a134c31fb1"} Dec 04 10:12:15 crc kubenswrapper[4776]: I1204 10:12:15.447007 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" Dec 04 10:12:15 crc kubenswrapper[4776]: I1204 10:12:15.600585 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-ssh-key-openstack-edpm-ipam\") pod \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\" (UID: \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\") " Dec 04 10:12:15 crc kubenswrapper[4776]: I1204 10:12:15.600976 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59jgd\" (UniqueName: \"kubernetes.io/projected/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-kube-api-access-59jgd\") pod \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\" (UID: \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\") " Dec 04 10:12:15 crc kubenswrapper[4776]: I1204 10:12:15.601184 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-inventory-0\") pod \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\" (UID: \"4282b94f-74cf-4ce8-9ab1-7235dfa27a56\") " Dec 04 10:12:15 crc kubenswrapper[4776]: I1204 10:12:15.608217 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-kube-api-access-59jgd" (OuterVolumeSpecName: "kube-api-access-59jgd") pod "4282b94f-74cf-4ce8-9ab1-7235dfa27a56" (UID: "4282b94f-74cf-4ce8-9ab1-7235dfa27a56"). InnerVolumeSpecName "kube-api-access-59jgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:12:15 crc kubenswrapper[4776]: I1204 10:12:15.631491 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4282b94f-74cf-4ce8-9ab1-7235dfa27a56" (UID: "4282b94f-74cf-4ce8-9ab1-7235dfa27a56"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:12:15 crc kubenswrapper[4776]: I1204 10:12:15.641183 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4282b94f-74cf-4ce8-9ab1-7235dfa27a56" (UID: "4282b94f-74cf-4ce8-9ab1-7235dfa27a56"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:12:15 crc kubenswrapper[4776]: I1204 10:12:15.703398 4776 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:12:15 crc kubenswrapper[4776]: I1204 10:12:15.703450 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 04 10:12:15 crc kubenswrapper[4776]: I1204 10:12:15.703462 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59jgd\" (UniqueName: \"kubernetes.io/projected/4282b94f-74cf-4ce8-9ab1-7235dfa27a56-kube-api-access-59jgd\") on node \"crc\" DevicePath \"\"" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.063443 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" 
event={"ID":"4282b94f-74cf-4ce8-9ab1-7235dfa27a56","Type":"ContainerDied","Data":"d294218371aa5b8619dd6839ec11d83bcafc5e37b0c19d9726dff4daa63e53f0"} Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.063491 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d294218371aa5b8619dd6839ec11d83bcafc5e37b0c19d9726dff4daa63e53f0" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.063561 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mts5t" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.149357 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv"] Dec 04 10:12:16 crc kubenswrapper[4776]: E1204 10:12:16.149851 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4282b94f-74cf-4ce8-9ab1-7235dfa27a56" containerName="ssh-known-hosts-edpm-deployment" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.149873 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4282b94f-74cf-4ce8-9ab1-7235dfa27a56" containerName="ssh-known-hosts-edpm-deployment" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.150129 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4282b94f-74cf-4ce8-9ab1-7235dfa27a56" containerName="ssh-known-hosts-edpm-deployment" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.150936 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.152675 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.153423 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.153857 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.153908 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.171599 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv"] Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.212160 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z7zdv\" (UID: \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.212250 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z7zdv\" (UID: \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.212329 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h759\" (UniqueName: \"kubernetes.io/projected/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-kube-api-access-4h759\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z7zdv\" (UID: \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.314250 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h759\" (UniqueName: \"kubernetes.io/projected/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-kube-api-access-4h759\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z7zdv\" (UID: \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.314384 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z7zdv\" (UID: \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.314476 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z7zdv\" (UID: \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.318441 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z7zdv\" (UID: \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.323476 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z7zdv\" (UID: \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.332145 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h759\" (UniqueName: \"kubernetes.io/projected/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-kube-api-access-4h759\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z7zdv\" (UID: \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.470259 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" Dec 04 10:12:16 crc kubenswrapper[4776]: I1204 10:12:16.979163 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv"] Dec 04 10:12:17 crc kubenswrapper[4776]: I1204 10:12:17.071237 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" event={"ID":"8ef0edf3-6a05-41cf-af58-f98b8b7492fa","Type":"ContainerStarted","Data":"50504a5b754124cf3422f8e7a0c41c6320fe6597d0e12370920aa5de254e8d8f"} Dec 04 10:12:19 crc kubenswrapper[4776]: I1204 10:12:19.090135 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" event={"ID":"8ef0edf3-6a05-41cf-af58-f98b8b7492fa","Type":"ContainerStarted","Data":"fa1993096d4e071b0bfe8f7ba825de1f6c6fa31bc2325cbc97d6884162d7f6de"} Dec 04 10:12:19 crc kubenswrapper[4776]: I1204 10:12:19.113204 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" podStartSLOduration=2.004557087 podStartE2EDuration="3.11318188s" podCreationTimestamp="2025-12-04 10:12:16 +0000 UTC" firstStartedPulling="2025-12-04 10:12:16.987526926 +0000 UTC m=+1981.854007303" lastFinishedPulling="2025-12-04 10:12:18.096151719 +0000 UTC m=+1982.962632096" observedRunningTime="2025-12-04 10:12:19.105415157 +0000 UTC m=+1983.971895534" watchObservedRunningTime="2025-12-04 10:12:19.11318188 +0000 UTC m=+1983.979662257" Dec 04 10:12:19 crc kubenswrapper[4776]: I1204 10:12:19.379582 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:12:19 crc kubenswrapper[4776]: I1204 
10:12:19.379958 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:12:21 crc kubenswrapper[4776]: I1204 10:12:21.095491 4776 scope.go:117] "RemoveContainer" containerID="03b7e9728273558121b71c4da2ebfe7ed7cfd1eb9b9deaf34148432e1c81bb73" Dec 04 10:12:21 crc kubenswrapper[4776]: I1204 10:12:21.132809 4776 scope.go:117] "RemoveContainer" containerID="3d985c2f5e44fcc35110063a7f39dfaf8028f5883eeb56b8b63aca84828480a4" Dec 04 10:12:21 crc kubenswrapper[4776]: I1204 10:12:21.191202 4776 scope.go:117] "RemoveContainer" containerID="9d06582bd9a4bc299fa642414be26627c5b4cff36919a19976ea357175b4e8f6" Dec 04 10:12:21 crc kubenswrapper[4776]: I1204 10:12:21.226794 4776 scope.go:117] "RemoveContainer" containerID="218ecda0a844e63a1bc441cc11be3bac531cabb28bd73bce33ce24ac587a079b" Dec 04 10:12:21 crc kubenswrapper[4776]: I1204 10:12:21.270592 4776 scope.go:117] "RemoveContainer" containerID="e9c408e16347b3639fbae337de308b7d6acfb1b7325d74cd39225fe260d22809" Dec 04 10:12:21 crc kubenswrapper[4776]: I1204 10:12:21.348794 4776 scope.go:117] "RemoveContainer" containerID="d00caf7ed44f8322e580717ac80cec1f1525d8cac5270dfa8652f866b238a26d" Dec 04 10:12:21 crc kubenswrapper[4776]: I1204 10:12:21.406857 4776 scope.go:117] "RemoveContainer" containerID="9c8ffff73df9a5eb2b923bf7652fbd57633ac3f23fd574b99637dbd86d54ae86" Dec 04 10:12:26 crc kubenswrapper[4776]: I1204 10:12:26.033210 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-h9cq9"] Dec 04 10:12:26 crc kubenswrapper[4776]: I1204 10:12:26.049809 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-h9cq9"] Dec 04 10:12:27 crc kubenswrapper[4776]: I1204 10:12:27.034831 4776 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wpwrq"] Dec 04 10:12:27 crc kubenswrapper[4776]: I1204 10:12:27.046637 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wpwrq"] Dec 04 10:12:27 crc kubenswrapper[4776]: I1204 10:12:27.163537 4776 generic.go:334] "Generic (PLEG): container finished" podID="8ef0edf3-6a05-41cf-af58-f98b8b7492fa" containerID="fa1993096d4e071b0bfe8f7ba825de1f6c6fa31bc2325cbc97d6884162d7f6de" exitCode=0 Dec 04 10:12:27 crc kubenswrapper[4776]: I1204 10:12:27.163588 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" event={"ID":"8ef0edf3-6a05-41cf-af58-f98b8b7492fa","Type":"ContainerDied","Data":"fa1993096d4e071b0bfe8f7ba825de1f6c6fa31bc2325cbc97d6884162d7f6de"} Dec 04 10:12:27 crc kubenswrapper[4776]: I1204 10:12:27.467063 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ae6f2de-abd7-4410-b165-82a134e89e93" path="/var/lib/kubelet/pods/9ae6f2de-abd7-4410-b165-82a134e89e93/volumes" Dec 04 10:12:27 crc kubenswrapper[4776]: I1204 10:12:27.467942 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9ea013b-0a00-4108-9e20-ac957fdbf524" path="/var/lib/kubelet/pods/d9ea013b-0a00-4108-9e20-ac957fdbf524/volumes" Dec 04 10:12:28 crc kubenswrapper[4776]: I1204 10:12:28.571269 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" Dec 04 10:12:28 crc kubenswrapper[4776]: I1204 10:12:28.648665 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-inventory\") pod \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\" (UID: \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\") " Dec 04 10:12:28 crc kubenswrapper[4776]: I1204 10:12:28.648756 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h759\" (UniqueName: \"kubernetes.io/projected/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-kube-api-access-4h759\") pod \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\" (UID: \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\") " Dec 04 10:12:28 crc kubenswrapper[4776]: I1204 10:12:28.648851 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-ssh-key\") pod \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\" (UID: \"8ef0edf3-6a05-41cf-af58-f98b8b7492fa\") " Dec 04 10:12:28 crc kubenswrapper[4776]: I1204 10:12:28.654253 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-kube-api-access-4h759" (OuterVolumeSpecName: "kube-api-access-4h759") pod "8ef0edf3-6a05-41cf-af58-f98b8b7492fa" (UID: "8ef0edf3-6a05-41cf-af58-f98b8b7492fa"). InnerVolumeSpecName "kube-api-access-4h759". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:12:28 crc kubenswrapper[4776]: I1204 10:12:28.676432 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-inventory" (OuterVolumeSpecName: "inventory") pod "8ef0edf3-6a05-41cf-af58-f98b8b7492fa" (UID: "8ef0edf3-6a05-41cf-af58-f98b8b7492fa"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:12:28 crc kubenswrapper[4776]: I1204 10:12:28.678639 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8ef0edf3-6a05-41cf-af58-f98b8b7492fa" (UID: "8ef0edf3-6a05-41cf-af58-f98b8b7492fa"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:12:28 crc kubenswrapper[4776]: I1204 10:12:28.751021 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:12:28 crc kubenswrapper[4776]: I1204 10:12:28.751058 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:12:28 crc kubenswrapper[4776]: I1204 10:12:28.751069 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h759\" (UniqueName: \"kubernetes.io/projected/8ef0edf3-6a05-41cf-af58-f98b8b7492fa-kube-api-access-4h759\") on node \"crc\" DevicePath \"\"" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.181457 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" event={"ID":"8ef0edf3-6a05-41cf-af58-f98b8b7492fa","Type":"ContainerDied","Data":"50504a5b754124cf3422f8e7a0c41c6320fe6597d0e12370920aa5de254e8d8f"} Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.181505 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50504a5b754124cf3422f8e7a0c41c6320fe6597d0e12370920aa5de254e8d8f" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.181513 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.258013 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp"] Dec 04 10:12:29 crc kubenswrapper[4776]: E1204 10:12:29.258423 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef0edf3-6a05-41cf-af58-f98b8b7492fa" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.258443 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef0edf3-6a05-41cf-af58-f98b8b7492fa" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.258668 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef0edf3-6a05-41cf-af58-f98b8b7492fa" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.259380 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.262144 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.262177 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.262264 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.262626 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.269048 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp"] Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.361341 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13a4c21c-5146-4d8d-9c2a-13c081f134c8-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp\" (UID: \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.361428 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v245v\" (UniqueName: \"kubernetes.io/projected/13a4c21c-5146-4d8d-9c2a-13c081f134c8-kube-api-access-v245v\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp\" (UID: \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.361478 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13a4c21c-5146-4d8d-9c2a-13c081f134c8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp\" (UID: \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.463212 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v245v\" (UniqueName: \"kubernetes.io/projected/13a4c21c-5146-4d8d-9c2a-13c081f134c8-kube-api-access-v245v\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp\" (UID: \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.463280 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13a4c21c-5146-4d8d-9c2a-13c081f134c8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp\" (UID: \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.463400 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13a4c21c-5146-4d8d-9c2a-13c081f134c8-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp\" (UID: \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.468151 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13a4c21c-5146-4d8d-9c2a-13c081f134c8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp\" (UID: \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.482233 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13a4c21c-5146-4d8d-9c2a-13c081f134c8-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp\" (UID: \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.482956 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v245v\" (UniqueName: \"kubernetes.io/projected/13a4c21c-5146-4d8d-9c2a-13c081f134c8-kube-api-access-v245v\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp\" (UID: \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" Dec 04 10:12:29 crc kubenswrapper[4776]: I1204 10:12:29.575732 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" Dec 04 10:12:30 crc kubenswrapper[4776]: I1204 10:12:30.127995 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp"] Dec 04 10:12:30 crc kubenswrapper[4776]: I1204 10:12:30.191229 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" event={"ID":"13a4c21c-5146-4d8d-9c2a-13c081f134c8","Type":"ContainerStarted","Data":"6a0968b0147845d165e7adbf100f14821e0c69675a189206382ef57fd019e530"} Dec 04 10:12:31 crc kubenswrapper[4776]: I1204 10:12:31.204407 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" event={"ID":"13a4c21c-5146-4d8d-9c2a-13c081f134c8","Type":"ContainerStarted","Data":"5d2cedd2be5a6ffc816bc3117d5b566cbff7772a0f331660defb9e15af676bab"} Dec 04 10:12:31 crc kubenswrapper[4776]: I1204 10:12:31.234464 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" podStartSLOduration=1.749094862 podStartE2EDuration="2.234446742s" podCreationTimestamp="2025-12-04 10:12:29 +0000 UTC" firstStartedPulling="2025-12-04 10:12:30.16353248 +0000 UTC m=+1995.030012857" lastFinishedPulling="2025-12-04 10:12:30.64888436 +0000 UTC m=+1995.515364737" observedRunningTime="2025-12-04 10:12:31.222795786 +0000 UTC m=+1996.089276193" watchObservedRunningTime="2025-12-04 10:12:31.234446742 +0000 UTC m=+1996.100927119" Dec 04 10:12:41 crc kubenswrapper[4776]: I1204 10:12:41.287800 4776 generic.go:334] "Generic (PLEG): container finished" podID="13a4c21c-5146-4d8d-9c2a-13c081f134c8" containerID="5d2cedd2be5a6ffc816bc3117d5b566cbff7772a0f331660defb9e15af676bab" exitCode=0 Dec 04 10:12:41 crc kubenswrapper[4776]: I1204 10:12:41.287892 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" event={"ID":"13a4c21c-5146-4d8d-9c2a-13c081f134c8","Type":"ContainerDied","Data":"5d2cedd2be5a6ffc816bc3117d5b566cbff7772a0f331660defb9e15af676bab"} Dec 04 10:12:42 crc kubenswrapper[4776]: I1204 10:12:42.699635 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" Dec 04 10:12:42 crc kubenswrapper[4776]: I1204 10:12:42.876753 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13a4c21c-5146-4d8d-9c2a-13c081f134c8-inventory\") pod \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\" (UID: \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\") " Dec 04 10:12:42 crc kubenswrapper[4776]: I1204 10:12:42.876876 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v245v\" (UniqueName: \"kubernetes.io/projected/13a4c21c-5146-4d8d-9c2a-13c081f134c8-kube-api-access-v245v\") pod \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\" (UID: \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\") " Dec 04 10:12:42 crc kubenswrapper[4776]: I1204 10:12:42.877003 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13a4c21c-5146-4d8d-9c2a-13c081f134c8-ssh-key\") pod \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\" (UID: \"13a4c21c-5146-4d8d-9c2a-13c081f134c8\") " Dec 04 10:12:42 crc kubenswrapper[4776]: I1204 10:12:42.882204 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a4c21c-5146-4d8d-9c2a-13c081f134c8-kube-api-access-v245v" (OuterVolumeSpecName: "kube-api-access-v245v") pod "13a4c21c-5146-4d8d-9c2a-13c081f134c8" (UID: "13a4c21c-5146-4d8d-9c2a-13c081f134c8"). InnerVolumeSpecName "kube-api-access-v245v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:12:42 crc kubenswrapper[4776]: I1204 10:12:42.904574 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a4c21c-5146-4d8d-9c2a-13c081f134c8-inventory" (OuterVolumeSpecName: "inventory") pod "13a4c21c-5146-4d8d-9c2a-13c081f134c8" (UID: "13a4c21c-5146-4d8d-9c2a-13c081f134c8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:12:42 crc kubenswrapper[4776]: I1204 10:12:42.904634 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a4c21c-5146-4d8d-9c2a-13c081f134c8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "13a4c21c-5146-4d8d-9c2a-13c081f134c8" (UID: "13a4c21c-5146-4d8d-9c2a-13c081f134c8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:12:42 crc kubenswrapper[4776]: I1204 10:12:42.979302 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13a4c21c-5146-4d8d-9c2a-13c081f134c8-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:12:42 crc kubenswrapper[4776]: I1204 10:12:42.979548 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v245v\" (UniqueName: \"kubernetes.io/projected/13a4c21c-5146-4d8d-9c2a-13c081f134c8-kube-api-access-v245v\") on node \"crc\" DevicePath \"\"" Dec 04 10:12:42 crc kubenswrapper[4776]: I1204 10:12:42.979619 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13a4c21c-5146-4d8d-9c2a-13c081f134c8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:12:43 crc kubenswrapper[4776]: I1204 10:12:43.308433 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" 
event={"ID":"13a4c21c-5146-4d8d-9c2a-13c081f134c8","Type":"ContainerDied","Data":"6a0968b0147845d165e7adbf100f14821e0c69675a189206382ef57fd019e530"} Dec 04 10:12:43 crc kubenswrapper[4776]: I1204 10:12:43.308487 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp" Dec 04 10:12:43 crc kubenswrapper[4776]: I1204 10:12:43.308488 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a0968b0147845d165e7adbf100f14821e0c69675a189206382ef57fd019e530" Dec 04 10:12:49 crc kubenswrapper[4776]: I1204 10:12:49.380377 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:12:49 crc kubenswrapper[4776]: I1204 10:12:49.380937 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.138421 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-stwtx"] Dec 04 10:13:11 crc kubenswrapper[4776]: E1204 10:13:11.139542 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a4c21c-5146-4d8d-9c2a-13c081f134c8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.139561 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a4c21c-5146-4d8d-9c2a-13c081f134c8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:13:11 crc kubenswrapper[4776]: 
I1204 10:13:11.139787 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a4c21c-5146-4d8d-9c2a-13c081f134c8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.145537 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.151318 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kkb8\" (UniqueName: \"kubernetes.io/projected/a6e6ac59-ded8-472e-b7c5-24961933e89e-kube-api-access-7kkb8\") pod \"certified-operators-stwtx\" (UID: \"a6e6ac59-ded8-472e-b7c5-24961933e89e\") " pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.151422 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e6ac59-ded8-472e-b7c5-24961933e89e-utilities\") pod \"certified-operators-stwtx\" (UID: \"a6e6ac59-ded8-472e-b7c5-24961933e89e\") " pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.151498 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e6ac59-ded8-472e-b7c5-24961933e89e-catalog-content\") pod \"certified-operators-stwtx\" (UID: \"a6e6ac59-ded8-472e-b7c5-24961933e89e\") " pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.160261 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-stwtx"] Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.253278 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kkb8\" (UniqueName: 
\"kubernetes.io/projected/a6e6ac59-ded8-472e-b7c5-24961933e89e-kube-api-access-7kkb8\") pod \"certified-operators-stwtx\" (UID: \"a6e6ac59-ded8-472e-b7c5-24961933e89e\") " pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.253399 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e6ac59-ded8-472e-b7c5-24961933e89e-utilities\") pod \"certified-operators-stwtx\" (UID: \"a6e6ac59-ded8-472e-b7c5-24961933e89e\") " pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.253488 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e6ac59-ded8-472e-b7c5-24961933e89e-catalog-content\") pod \"certified-operators-stwtx\" (UID: \"a6e6ac59-ded8-472e-b7c5-24961933e89e\") " pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.254103 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e6ac59-ded8-472e-b7c5-24961933e89e-utilities\") pod \"certified-operators-stwtx\" (UID: \"a6e6ac59-ded8-472e-b7c5-24961933e89e\") " pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.254111 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e6ac59-ded8-472e-b7c5-24961933e89e-catalog-content\") pod \"certified-operators-stwtx\" (UID: \"a6e6ac59-ded8-472e-b7c5-24961933e89e\") " pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.275102 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kkb8\" (UniqueName: 
\"kubernetes.io/projected/a6e6ac59-ded8-472e-b7c5-24961933e89e-kube-api-access-7kkb8\") pod \"certified-operators-stwtx\" (UID: \"a6e6ac59-ded8-472e-b7c5-24961933e89e\") " pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:11 crc kubenswrapper[4776]: I1204 10:13:11.477360 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:12 crc kubenswrapper[4776]: I1204 10:13:12.050785 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mmks9"] Dec 04 10:13:12 crc kubenswrapper[4776]: I1204 10:13:12.055619 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mmks9"] Dec 04 10:13:12 crc kubenswrapper[4776]: I1204 10:13:12.108791 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-stwtx"] Dec 04 10:13:12 crc kubenswrapper[4776]: E1204 10:13:12.670435 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e6ac59_ded8_472e_b7c5_24961933e89e.slice/crio-conmon-88f903687e6a7673ed39c957ce79c929ace365bb1da6076c026b842f50fcca0b.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:13:12 crc kubenswrapper[4776]: I1204 10:13:12.694010 4776 generic.go:334] "Generic (PLEG): container finished" podID="a6e6ac59-ded8-472e-b7c5-24961933e89e" containerID="88f903687e6a7673ed39c957ce79c929ace365bb1da6076c026b842f50fcca0b" exitCode=0 Dec 04 10:13:12 crc kubenswrapper[4776]: I1204 10:13:12.694072 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-stwtx" event={"ID":"a6e6ac59-ded8-472e-b7c5-24961933e89e","Type":"ContainerDied","Data":"88f903687e6a7673ed39c957ce79c929ace365bb1da6076c026b842f50fcca0b"} Dec 04 10:13:12 crc kubenswrapper[4776]: I1204 10:13:12.694096 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-stwtx" event={"ID":"a6e6ac59-ded8-472e-b7c5-24961933e89e","Type":"ContainerStarted","Data":"e407630905f88242aa1a6fdeef310f27b977e5eb6d20b44f761ab582bf4d1bbf"} Dec 04 10:13:13 crc kubenswrapper[4776]: I1204 10:13:13.468549 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a4e1405-c0c0-4fb1-a9bb-a93612a2528b" path="/var/lib/kubelet/pods/4a4e1405-c0c0-4fb1-a9bb-a93612a2528b/volumes" Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.184013 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tm2m8"] Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.186546 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.195510 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tm2m8"] Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.385670 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01a73590-e344-4adf-bda3-34b8835a7695-utilities\") pod \"redhat-operators-tm2m8\" (UID: \"01a73590-e344-4adf-bda3-34b8835a7695\") " pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.385723 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ttxj\" (UniqueName: \"kubernetes.io/projected/01a73590-e344-4adf-bda3-34b8835a7695-kube-api-access-5ttxj\") pod \"redhat-operators-tm2m8\" (UID: \"01a73590-e344-4adf-bda3-34b8835a7695\") " pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.385758 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01a73590-e344-4adf-bda3-34b8835a7695-catalog-content\") pod \"redhat-operators-tm2m8\" (UID: \"01a73590-e344-4adf-bda3-34b8835a7695\") " pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.486954 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01a73590-e344-4adf-bda3-34b8835a7695-utilities\") pod \"redhat-operators-tm2m8\" (UID: \"01a73590-e344-4adf-bda3-34b8835a7695\") " pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.487012 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ttxj\" (UniqueName: \"kubernetes.io/projected/01a73590-e344-4adf-bda3-34b8835a7695-kube-api-access-5ttxj\") pod \"redhat-operators-tm2m8\" (UID: \"01a73590-e344-4adf-bda3-34b8835a7695\") " pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.487042 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01a73590-e344-4adf-bda3-34b8835a7695-catalog-content\") pod \"redhat-operators-tm2m8\" (UID: \"01a73590-e344-4adf-bda3-34b8835a7695\") " pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.487668 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01a73590-e344-4adf-bda3-34b8835a7695-utilities\") pod \"redhat-operators-tm2m8\" (UID: \"01a73590-e344-4adf-bda3-34b8835a7695\") " pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.487708 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/01a73590-e344-4adf-bda3-34b8835a7695-catalog-content\") pod \"redhat-operators-tm2m8\" (UID: \"01a73590-e344-4adf-bda3-34b8835a7695\") " pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.507770 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ttxj\" (UniqueName: \"kubernetes.io/projected/01a73590-e344-4adf-bda3-34b8835a7695-kube-api-access-5ttxj\") pod \"redhat-operators-tm2m8\" (UID: \"01a73590-e344-4adf-bda3-34b8835a7695\") " pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.523581 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.727215 4776 generic.go:334] "Generic (PLEG): container finished" podID="a6e6ac59-ded8-472e-b7c5-24961933e89e" containerID="2de63601bada8d6cc743326e5712f35deee88cec9b8499c611a8c8207f8fdeac" exitCode=0 Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.727572 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-stwtx" event={"ID":"a6e6ac59-ded8-472e-b7c5-24961933e89e","Type":"ContainerDied","Data":"2de63601bada8d6cc743326e5712f35deee88cec9b8499c611a8c8207f8fdeac"} Dec 04 10:13:14 crc kubenswrapper[4776]: I1204 10:13:14.975177 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tm2m8"] Dec 04 10:13:14 crc kubenswrapper[4776]: W1204 10:13:14.980650 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01a73590_e344_4adf_bda3_34b8835a7695.slice/crio-ccb6888a18a8135d9e3973cbbde77591af0d70cbc26f4b4f46c75a3e7fb197d5 WatchSource:0}: Error finding container ccb6888a18a8135d9e3973cbbde77591af0d70cbc26f4b4f46c75a3e7fb197d5: Status 404 returned error can't find 
the container with id ccb6888a18a8135d9e3973cbbde77591af0d70cbc26f4b4f46c75a3e7fb197d5 Dec 04 10:13:15 crc kubenswrapper[4776]: I1204 10:13:15.736895 4776 generic.go:334] "Generic (PLEG): container finished" podID="01a73590-e344-4adf-bda3-34b8835a7695" containerID="443987b60f77b40776687b01088ca820021e3d6f57302b64f2cb8e20b8860f4f" exitCode=0 Dec 04 10:13:15 crc kubenswrapper[4776]: I1204 10:13:15.737030 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tm2m8" event={"ID":"01a73590-e344-4adf-bda3-34b8835a7695","Type":"ContainerDied","Data":"443987b60f77b40776687b01088ca820021e3d6f57302b64f2cb8e20b8860f4f"} Dec 04 10:13:15 crc kubenswrapper[4776]: I1204 10:13:15.737344 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tm2m8" event={"ID":"01a73590-e344-4adf-bda3-34b8835a7695","Type":"ContainerStarted","Data":"ccb6888a18a8135d9e3973cbbde77591af0d70cbc26f4b4f46c75a3e7fb197d5"} Dec 04 10:13:15 crc kubenswrapper[4776]: I1204 10:13:15.740272 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-stwtx" event={"ID":"a6e6ac59-ded8-472e-b7c5-24961933e89e","Type":"ContainerStarted","Data":"949ecae3956e888e8a28bf83b090538c903ed9615722a5cc62cf20db6d723294"} Dec 04 10:13:15 crc kubenswrapper[4776]: I1204 10:13:15.795369 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-stwtx" podStartSLOduration=2.341483687 podStartE2EDuration="4.795344357s" podCreationTimestamp="2025-12-04 10:13:11 +0000 UTC" firstStartedPulling="2025-12-04 10:13:12.695965649 +0000 UTC m=+2037.562446026" lastFinishedPulling="2025-12-04 10:13:15.149826319 +0000 UTC m=+2040.016306696" observedRunningTime="2025-12-04 10:13:15.792536239 +0000 UTC m=+2040.659016636" watchObservedRunningTime="2025-12-04 10:13:15.795344357 +0000 UTC m=+2040.661824734" Dec 04 10:13:16 crc kubenswrapper[4776]: I1204 
10:13:16.754196 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tm2m8" event={"ID":"01a73590-e344-4adf-bda3-34b8835a7695","Type":"ContainerStarted","Data":"e8f8d16b5a905f47f4212970a6f2076a308a880f39a25b03adbab7ce9eb23cca"} Dec 04 10:13:18 crc kubenswrapper[4776]: I1204 10:13:18.776073 4776 generic.go:334] "Generic (PLEG): container finished" podID="01a73590-e344-4adf-bda3-34b8835a7695" containerID="e8f8d16b5a905f47f4212970a6f2076a308a880f39a25b03adbab7ce9eb23cca" exitCode=0 Dec 04 10:13:18 crc kubenswrapper[4776]: I1204 10:13:18.776154 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tm2m8" event={"ID":"01a73590-e344-4adf-bda3-34b8835a7695","Type":"ContainerDied","Data":"e8f8d16b5a905f47f4212970a6f2076a308a880f39a25b03adbab7ce9eb23cca"} Dec 04 10:13:19 crc kubenswrapper[4776]: I1204 10:13:19.380572 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:13:19 crc kubenswrapper[4776]: I1204 10:13:19.380647 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:13:19 crc kubenswrapper[4776]: I1204 10:13:19.380697 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 10:13:19 crc kubenswrapper[4776]: I1204 10:13:19.381513 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"2ef57bb754c7648e58638acbf6214793456368538049ee0eeb6ee6e04ffa60f3"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:13:19 crc kubenswrapper[4776]: I1204 10:13:19.381569 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://2ef57bb754c7648e58638acbf6214793456368538049ee0eeb6ee6e04ffa60f3" gracePeriod=600 Dec 04 10:13:19 crc kubenswrapper[4776]: I1204 10:13:19.788240 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="2ef57bb754c7648e58638acbf6214793456368538049ee0eeb6ee6e04ffa60f3" exitCode=0 Dec 04 10:13:19 crc kubenswrapper[4776]: I1204 10:13:19.788323 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"2ef57bb754c7648e58638acbf6214793456368538049ee0eeb6ee6e04ffa60f3"} Dec 04 10:13:19 crc kubenswrapper[4776]: I1204 10:13:19.788541 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572"} Dec 04 10:13:19 crc kubenswrapper[4776]: I1204 10:13:19.788568 4776 scope.go:117] "RemoveContainer" containerID="76526a3b83d108443b77b09fbfb9f31daea74a7b731f32d6b6c66786745eff96" Dec 04 10:13:20 crc kubenswrapper[4776]: I1204 10:13:20.801445 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tm2m8" 
event={"ID":"01a73590-e344-4adf-bda3-34b8835a7695","Type":"ContainerStarted","Data":"a92636e72ac8b2b42bd260ca60a479a6f986f4028169f15df8497f16bf2759c8"} Dec 04 10:13:20 crc kubenswrapper[4776]: I1204 10:13:20.827308 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tm2m8" podStartSLOduration=2.897291091 podStartE2EDuration="6.827288185s" podCreationTimestamp="2025-12-04 10:13:14 +0000 UTC" firstStartedPulling="2025-12-04 10:13:15.738630468 +0000 UTC m=+2040.605110845" lastFinishedPulling="2025-12-04 10:13:19.668627542 +0000 UTC m=+2044.535107939" observedRunningTime="2025-12-04 10:13:20.820530154 +0000 UTC m=+2045.687010521" watchObservedRunningTime="2025-12-04 10:13:20.827288185 +0000 UTC m=+2045.693768562" Dec 04 10:13:21 crc kubenswrapper[4776]: I1204 10:13:21.478505 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:21 crc kubenswrapper[4776]: I1204 10:13:21.478554 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:21 crc kubenswrapper[4776]: I1204 10:13:21.529619 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:21 crc kubenswrapper[4776]: I1204 10:13:21.563690 4776 scope.go:117] "RemoveContainer" containerID="f0eb881180561bed3f783cfd79feaeae18711639ead36828bc866e32475aee2b" Dec 04 10:13:21 crc kubenswrapper[4776]: I1204 10:13:21.606011 4776 scope.go:117] "RemoveContainer" containerID="67ee3f123aa0b70b77f3a9940f957cd1fde7c13804b6ff9f1b7add78f5ab4540" Dec 04 10:13:21 crc kubenswrapper[4776]: I1204 10:13:21.663389 4776 scope.go:117] "RemoveContainer" containerID="0fad2eb09a04b456b7c4f3cc1e90eb7bfdcb5860475f5763188003645156b94b" Dec 04 10:13:21 crc kubenswrapper[4776]: I1204 10:13:21.874021 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:24 crc kubenswrapper[4776]: I1204 10:13:24.524494 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:24 crc kubenswrapper[4776]: I1204 10:13:24.525336 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:24 crc kubenswrapper[4776]: I1204 10:13:24.930982 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-stwtx"] Dec 04 10:13:24 crc kubenswrapper[4776]: I1204 10:13:24.931553 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-stwtx" podUID="a6e6ac59-ded8-472e-b7c5-24961933e89e" containerName="registry-server" containerID="cri-o://949ecae3956e888e8a28bf83b090538c903ed9615722a5cc62cf20db6d723294" gracePeriod=2 Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.384662 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.433459 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kkb8\" (UniqueName: \"kubernetes.io/projected/a6e6ac59-ded8-472e-b7c5-24961933e89e-kube-api-access-7kkb8\") pod \"a6e6ac59-ded8-472e-b7c5-24961933e89e\" (UID: \"a6e6ac59-ded8-472e-b7c5-24961933e89e\") " Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.433580 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e6ac59-ded8-472e-b7c5-24961933e89e-catalog-content\") pod \"a6e6ac59-ded8-472e-b7c5-24961933e89e\" (UID: \"a6e6ac59-ded8-472e-b7c5-24961933e89e\") " Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.433698 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e6ac59-ded8-472e-b7c5-24961933e89e-utilities\") pod \"a6e6ac59-ded8-472e-b7c5-24961933e89e\" (UID: \"a6e6ac59-ded8-472e-b7c5-24961933e89e\") " Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.434611 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e6ac59-ded8-472e-b7c5-24961933e89e-utilities" (OuterVolumeSpecName: "utilities") pod "a6e6ac59-ded8-472e-b7c5-24961933e89e" (UID: "a6e6ac59-ded8-472e-b7c5-24961933e89e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.441726 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e6ac59-ded8-472e-b7c5-24961933e89e-kube-api-access-7kkb8" (OuterVolumeSpecName: "kube-api-access-7kkb8") pod "a6e6ac59-ded8-472e-b7c5-24961933e89e" (UID: "a6e6ac59-ded8-472e-b7c5-24961933e89e"). InnerVolumeSpecName "kube-api-access-7kkb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.495438 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e6ac59-ded8-472e-b7c5-24961933e89e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6e6ac59-ded8-472e-b7c5-24961933e89e" (UID: "a6e6ac59-ded8-472e-b7c5-24961933e89e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.535799 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e6ac59-ded8-472e-b7c5-24961933e89e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.535836 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kkb8\" (UniqueName: \"kubernetes.io/projected/a6e6ac59-ded8-472e-b7c5-24961933e89e-kube-api-access-7kkb8\") on node \"crc\" DevicePath \"\"" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.535847 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e6ac59-ded8-472e-b7c5-24961933e89e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.575276 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tm2m8" podUID="01a73590-e344-4adf-bda3-34b8835a7695" containerName="registry-server" probeResult="failure" output=< Dec 04 10:13:25 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 04 10:13:25 crc kubenswrapper[4776]: > Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.856412 4776 generic.go:334] "Generic (PLEG): container finished" podID="a6e6ac59-ded8-472e-b7c5-24961933e89e" containerID="949ecae3956e888e8a28bf83b090538c903ed9615722a5cc62cf20db6d723294" exitCode=0 Dec 04 10:13:25 
crc kubenswrapper[4776]: I1204 10:13:25.856481 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-stwtx" event={"ID":"a6e6ac59-ded8-472e-b7c5-24961933e89e","Type":"ContainerDied","Data":"949ecae3956e888e8a28bf83b090538c903ed9615722a5cc62cf20db6d723294"} Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.856523 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-stwtx" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.856570 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-stwtx" event={"ID":"a6e6ac59-ded8-472e-b7c5-24961933e89e","Type":"ContainerDied","Data":"e407630905f88242aa1a6fdeef310f27b977e5eb6d20b44f761ab582bf4d1bbf"} Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.856607 4776 scope.go:117] "RemoveContainer" containerID="949ecae3956e888e8a28bf83b090538c903ed9615722a5cc62cf20db6d723294" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.881548 4776 scope.go:117] "RemoveContainer" containerID="2de63601bada8d6cc743326e5712f35deee88cec9b8499c611a8c8207f8fdeac" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.921756 4776 scope.go:117] "RemoveContainer" containerID="88f903687e6a7673ed39c957ce79c929ace365bb1da6076c026b842f50fcca0b" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.969810 4776 scope.go:117] "RemoveContainer" containerID="949ecae3956e888e8a28bf83b090538c903ed9615722a5cc62cf20db6d723294" Dec 04 10:13:25 crc kubenswrapper[4776]: E1204 10:13:25.970509 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"949ecae3956e888e8a28bf83b090538c903ed9615722a5cc62cf20db6d723294\": container with ID starting with 949ecae3956e888e8a28bf83b090538c903ed9615722a5cc62cf20db6d723294 not found: ID does not exist" 
containerID="949ecae3956e888e8a28bf83b090538c903ed9615722a5cc62cf20db6d723294" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.970552 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"949ecae3956e888e8a28bf83b090538c903ed9615722a5cc62cf20db6d723294"} err="failed to get container status \"949ecae3956e888e8a28bf83b090538c903ed9615722a5cc62cf20db6d723294\": rpc error: code = NotFound desc = could not find container \"949ecae3956e888e8a28bf83b090538c903ed9615722a5cc62cf20db6d723294\": container with ID starting with 949ecae3956e888e8a28bf83b090538c903ed9615722a5cc62cf20db6d723294 not found: ID does not exist" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.970600 4776 scope.go:117] "RemoveContainer" containerID="2de63601bada8d6cc743326e5712f35deee88cec9b8499c611a8c8207f8fdeac" Dec 04 10:13:25 crc kubenswrapper[4776]: E1204 10:13:25.971011 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de63601bada8d6cc743326e5712f35deee88cec9b8499c611a8c8207f8fdeac\": container with ID starting with 2de63601bada8d6cc743326e5712f35deee88cec9b8499c611a8c8207f8fdeac not found: ID does not exist" containerID="2de63601bada8d6cc743326e5712f35deee88cec9b8499c611a8c8207f8fdeac" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.971046 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de63601bada8d6cc743326e5712f35deee88cec9b8499c611a8c8207f8fdeac"} err="failed to get container status \"2de63601bada8d6cc743326e5712f35deee88cec9b8499c611a8c8207f8fdeac\": rpc error: code = NotFound desc = could not find container \"2de63601bada8d6cc743326e5712f35deee88cec9b8499c611a8c8207f8fdeac\": container with ID starting with 2de63601bada8d6cc743326e5712f35deee88cec9b8499c611a8c8207f8fdeac not found: ID does not exist" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.971072 4776 scope.go:117] 
"RemoveContainer" containerID="88f903687e6a7673ed39c957ce79c929ace365bb1da6076c026b842f50fcca0b" Dec 04 10:13:25 crc kubenswrapper[4776]: E1204 10:13:25.971601 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f903687e6a7673ed39c957ce79c929ace365bb1da6076c026b842f50fcca0b\": container with ID starting with 88f903687e6a7673ed39c957ce79c929ace365bb1da6076c026b842f50fcca0b not found: ID does not exist" containerID="88f903687e6a7673ed39c957ce79c929ace365bb1da6076c026b842f50fcca0b" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.971629 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f903687e6a7673ed39c957ce79c929ace365bb1da6076c026b842f50fcca0b"} err="failed to get container status \"88f903687e6a7673ed39c957ce79c929ace365bb1da6076c026b842f50fcca0b\": rpc error: code = NotFound desc = could not find container \"88f903687e6a7673ed39c957ce79c929ace365bb1da6076c026b842f50fcca0b\": container with ID starting with 88f903687e6a7673ed39c957ce79c929ace365bb1da6076c026b842f50fcca0b not found: ID does not exist" Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.978404 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-stwtx"] Dec 04 10:13:25 crc kubenswrapper[4776]: I1204 10:13:25.992778 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-stwtx"] Dec 04 10:13:27 crc kubenswrapper[4776]: I1204 10:13:27.463549 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e6ac59-ded8-472e-b7c5-24961933e89e" path="/var/lib/kubelet/pods/a6e6ac59-ded8-472e-b7c5-24961933e89e/volumes" Dec 04 10:13:34 crc kubenswrapper[4776]: I1204 10:13:34.574612 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:34 crc kubenswrapper[4776]: I1204 
10:13:34.623738 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:34 crc kubenswrapper[4776]: I1204 10:13:34.809257 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tm2m8"] Dec 04 10:13:35 crc kubenswrapper[4776]: I1204 10:13:35.943738 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tm2m8" podUID="01a73590-e344-4adf-bda3-34b8835a7695" containerName="registry-server" containerID="cri-o://a92636e72ac8b2b42bd260ca60a479a6f986f4028169f15df8497f16bf2759c8" gracePeriod=2 Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.362148 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.548124 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01a73590-e344-4adf-bda3-34b8835a7695-catalog-content\") pod \"01a73590-e344-4adf-bda3-34b8835a7695\" (UID: \"01a73590-e344-4adf-bda3-34b8835a7695\") " Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.548242 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01a73590-e344-4adf-bda3-34b8835a7695-utilities\") pod \"01a73590-e344-4adf-bda3-34b8835a7695\" (UID: \"01a73590-e344-4adf-bda3-34b8835a7695\") " Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.548325 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ttxj\" (UniqueName: \"kubernetes.io/projected/01a73590-e344-4adf-bda3-34b8835a7695-kube-api-access-5ttxj\") pod \"01a73590-e344-4adf-bda3-34b8835a7695\" (UID: \"01a73590-e344-4adf-bda3-34b8835a7695\") " Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 
10:13:36.551176 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01a73590-e344-4adf-bda3-34b8835a7695-utilities" (OuterVolumeSpecName: "utilities") pod "01a73590-e344-4adf-bda3-34b8835a7695" (UID: "01a73590-e344-4adf-bda3-34b8835a7695"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.558496 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a73590-e344-4adf-bda3-34b8835a7695-kube-api-access-5ttxj" (OuterVolumeSpecName: "kube-api-access-5ttxj") pod "01a73590-e344-4adf-bda3-34b8835a7695" (UID: "01a73590-e344-4adf-bda3-34b8835a7695"). InnerVolumeSpecName "kube-api-access-5ttxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.650272 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01a73590-e344-4adf-bda3-34b8835a7695-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.650304 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ttxj\" (UniqueName: \"kubernetes.io/projected/01a73590-e344-4adf-bda3-34b8835a7695-kube-api-access-5ttxj\") on node \"crc\" DevicePath \"\"" Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.669315 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01a73590-e344-4adf-bda3-34b8835a7695-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01a73590-e344-4adf-bda3-34b8835a7695" (UID: "01a73590-e344-4adf-bda3-34b8835a7695"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.751686 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01a73590-e344-4adf-bda3-34b8835a7695-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.959355 4776 generic.go:334] "Generic (PLEG): container finished" podID="01a73590-e344-4adf-bda3-34b8835a7695" containerID="a92636e72ac8b2b42bd260ca60a479a6f986f4028169f15df8497f16bf2759c8" exitCode=0 Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.959431 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tm2m8" Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.959457 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tm2m8" event={"ID":"01a73590-e344-4adf-bda3-34b8835a7695","Type":"ContainerDied","Data":"a92636e72ac8b2b42bd260ca60a479a6f986f4028169f15df8497f16bf2759c8"} Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.960851 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tm2m8" event={"ID":"01a73590-e344-4adf-bda3-34b8835a7695","Type":"ContainerDied","Data":"ccb6888a18a8135d9e3973cbbde77591af0d70cbc26f4b4f46c75a3e7fb197d5"} Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.960967 4776 scope.go:117] "RemoveContainer" containerID="a92636e72ac8b2b42bd260ca60a479a6f986f4028169f15df8497f16bf2759c8" Dec 04 10:13:36 crc kubenswrapper[4776]: I1204 10:13:36.993168 4776 scope.go:117] "RemoveContainer" containerID="e8f8d16b5a905f47f4212970a6f2076a308a880f39a25b03adbab7ce9eb23cca" Dec 04 10:13:37 crc kubenswrapper[4776]: I1204 10:13:36.999046 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tm2m8"] Dec 04 10:13:37 crc kubenswrapper[4776]: I1204 
10:13:37.010479 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tm2m8"] Dec 04 10:13:37 crc kubenswrapper[4776]: I1204 10:13:37.087267 4776 scope.go:117] "RemoveContainer" containerID="443987b60f77b40776687b01088ca820021e3d6f57302b64f2cb8e20b8860f4f" Dec 04 10:13:37 crc kubenswrapper[4776]: I1204 10:13:37.172769 4776 scope.go:117] "RemoveContainer" containerID="a92636e72ac8b2b42bd260ca60a479a6f986f4028169f15df8497f16bf2759c8" Dec 04 10:13:37 crc kubenswrapper[4776]: E1204 10:13:37.175239 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a92636e72ac8b2b42bd260ca60a479a6f986f4028169f15df8497f16bf2759c8\": container with ID starting with a92636e72ac8b2b42bd260ca60a479a6f986f4028169f15df8497f16bf2759c8 not found: ID does not exist" containerID="a92636e72ac8b2b42bd260ca60a479a6f986f4028169f15df8497f16bf2759c8" Dec 04 10:13:37 crc kubenswrapper[4776]: I1204 10:13:37.175288 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92636e72ac8b2b42bd260ca60a479a6f986f4028169f15df8497f16bf2759c8"} err="failed to get container status \"a92636e72ac8b2b42bd260ca60a479a6f986f4028169f15df8497f16bf2759c8\": rpc error: code = NotFound desc = could not find container \"a92636e72ac8b2b42bd260ca60a479a6f986f4028169f15df8497f16bf2759c8\": container with ID starting with a92636e72ac8b2b42bd260ca60a479a6f986f4028169f15df8497f16bf2759c8 not found: ID does not exist" Dec 04 10:13:37 crc kubenswrapper[4776]: I1204 10:13:37.175321 4776 scope.go:117] "RemoveContainer" containerID="e8f8d16b5a905f47f4212970a6f2076a308a880f39a25b03adbab7ce9eb23cca" Dec 04 10:13:37 crc kubenswrapper[4776]: E1204 10:13:37.179287 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8f8d16b5a905f47f4212970a6f2076a308a880f39a25b03adbab7ce9eb23cca\": container with ID 
starting with e8f8d16b5a905f47f4212970a6f2076a308a880f39a25b03adbab7ce9eb23cca not found: ID does not exist" containerID="e8f8d16b5a905f47f4212970a6f2076a308a880f39a25b03adbab7ce9eb23cca" Dec 04 10:13:37 crc kubenswrapper[4776]: I1204 10:13:37.179328 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8f8d16b5a905f47f4212970a6f2076a308a880f39a25b03adbab7ce9eb23cca"} err="failed to get container status \"e8f8d16b5a905f47f4212970a6f2076a308a880f39a25b03adbab7ce9eb23cca\": rpc error: code = NotFound desc = could not find container \"e8f8d16b5a905f47f4212970a6f2076a308a880f39a25b03adbab7ce9eb23cca\": container with ID starting with e8f8d16b5a905f47f4212970a6f2076a308a880f39a25b03adbab7ce9eb23cca not found: ID does not exist" Dec 04 10:13:37 crc kubenswrapper[4776]: I1204 10:13:37.179354 4776 scope.go:117] "RemoveContainer" containerID="443987b60f77b40776687b01088ca820021e3d6f57302b64f2cb8e20b8860f4f" Dec 04 10:13:37 crc kubenswrapper[4776]: E1204 10:13:37.183311 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"443987b60f77b40776687b01088ca820021e3d6f57302b64f2cb8e20b8860f4f\": container with ID starting with 443987b60f77b40776687b01088ca820021e3d6f57302b64f2cb8e20b8860f4f not found: ID does not exist" containerID="443987b60f77b40776687b01088ca820021e3d6f57302b64f2cb8e20b8860f4f" Dec 04 10:13:37 crc kubenswrapper[4776]: I1204 10:13:37.183350 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443987b60f77b40776687b01088ca820021e3d6f57302b64f2cb8e20b8860f4f"} err="failed to get container status \"443987b60f77b40776687b01088ca820021e3d6f57302b64f2cb8e20b8860f4f\": rpc error: code = NotFound desc = could not find container \"443987b60f77b40776687b01088ca820021e3d6f57302b64f2cb8e20b8860f4f\": container with ID starting with 443987b60f77b40776687b01088ca820021e3d6f57302b64f2cb8e20b8860f4f not found: 
ID does not exist" Dec 04 10:13:37 crc kubenswrapper[4776]: I1204 10:13:37.466345 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01a73590-e344-4adf-bda3-34b8835a7695" path="/var/lib/kubelet/pods/01a73590-e344-4adf-bda3-34b8835a7695/volumes" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.156985 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4"] Dec 04 10:15:00 crc kubenswrapper[4776]: E1204 10:15:00.158677 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e6ac59-ded8-472e-b7c5-24961933e89e" containerName="extract-content" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.158705 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e6ac59-ded8-472e-b7c5-24961933e89e" containerName="extract-content" Dec 04 10:15:00 crc kubenswrapper[4776]: E1204 10:15:00.158726 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a73590-e344-4adf-bda3-34b8835a7695" containerName="extract-utilities" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.158734 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a73590-e344-4adf-bda3-34b8835a7695" containerName="extract-utilities" Dec 04 10:15:00 crc kubenswrapper[4776]: E1204 10:15:00.158743 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a73590-e344-4adf-bda3-34b8835a7695" containerName="extract-content" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.158750 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a73590-e344-4adf-bda3-34b8835a7695" containerName="extract-content" Dec 04 10:15:00 crc kubenswrapper[4776]: E1204 10:15:00.158777 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e6ac59-ded8-472e-b7c5-24961933e89e" containerName="registry-server" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.158783 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a6e6ac59-ded8-472e-b7c5-24961933e89e" containerName="registry-server" Dec 04 10:15:00 crc kubenswrapper[4776]: E1204 10:15:00.158800 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a73590-e344-4adf-bda3-34b8835a7695" containerName="registry-server" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.158807 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a73590-e344-4adf-bda3-34b8835a7695" containerName="registry-server" Dec 04 10:15:00 crc kubenswrapper[4776]: E1204 10:15:00.158828 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e6ac59-ded8-472e-b7c5-24961933e89e" containerName="extract-utilities" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.158835 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e6ac59-ded8-472e-b7c5-24961933e89e" containerName="extract-utilities" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.159053 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e6ac59-ded8-472e-b7c5-24961933e89e" containerName="registry-server" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.159083 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a73590-e344-4adf-bda3-34b8835a7695" containerName="registry-server" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.160170 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.165268 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.166530 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.168482 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4"] Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.263304 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-secret-volume\") pod \"collect-profiles-29414055-ccgt4\" (UID: \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.263656 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztl57\" (UniqueName: \"kubernetes.io/projected/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-kube-api-access-ztl57\") pod \"collect-profiles-29414055-ccgt4\" (UID: \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.263881 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-config-volume\") pod \"collect-profiles-29414055-ccgt4\" (UID: \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.365641 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-config-volume\") pod \"collect-profiles-29414055-ccgt4\" (UID: \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.366133 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-secret-volume\") pod \"collect-profiles-29414055-ccgt4\" (UID: \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.366313 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztl57\" (UniqueName: \"kubernetes.io/projected/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-kube-api-access-ztl57\") pod \"collect-profiles-29414055-ccgt4\" (UID: \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.366799 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-config-volume\") pod \"collect-profiles-29414055-ccgt4\" (UID: \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.373648 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-secret-volume\") pod \"collect-profiles-29414055-ccgt4\" (UID: \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.389022 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztl57\" (UniqueName: \"kubernetes.io/projected/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-kube-api-access-ztl57\") pod \"collect-profiles-29414055-ccgt4\" (UID: \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.490518 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" Dec 04 10:15:00 crc kubenswrapper[4776]: I1204 10:15:00.933772 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4"] Dec 04 10:15:01 crc kubenswrapper[4776]: I1204 10:15:01.686444 4776 generic.go:334] "Generic (PLEG): container finished" podID="5d0dc27f-b4da-4930-9196-3c2d4c21aee2" containerID="fbeaf8fe063b7431f9a1de0a404485132d9488982dfe6bc7aae07aadfdc9f89c" exitCode=0 Dec 04 10:15:01 crc kubenswrapper[4776]: I1204 10:15:01.686669 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" event={"ID":"5d0dc27f-b4da-4930-9196-3c2d4c21aee2","Type":"ContainerDied","Data":"fbeaf8fe063b7431f9a1de0a404485132d9488982dfe6bc7aae07aadfdc9f89c"} Dec 04 10:15:01 crc kubenswrapper[4776]: I1204 10:15:01.686748 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" 
event={"ID":"5d0dc27f-b4da-4930-9196-3c2d4c21aee2","Type":"ContainerStarted","Data":"5ca170858c03627874f6c2cd71587b58866aed90c84b621e39b8fe920d5564fb"} Dec 04 10:15:03 crc kubenswrapper[4776]: I1204 10:15:03.040777 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" Dec 04 10:15:03 crc kubenswrapper[4776]: I1204 10:15:03.123180 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztl57\" (UniqueName: \"kubernetes.io/projected/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-kube-api-access-ztl57\") pod \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\" (UID: \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\") " Dec 04 10:15:03 crc kubenswrapper[4776]: I1204 10:15:03.123281 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-secret-volume\") pod \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\" (UID: \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\") " Dec 04 10:15:03 crc kubenswrapper[4776]: I1204 10:15:03.123433 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-config-volume\") pod \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\" (UID: \"5d0dc27f-b4da-4930-9196-3c2d4c21aee2\") " Dec 04 10:15:03 crc kubenswrapper[4776]: I1204 10:15:03.124470 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-config-volume" (OuterVolumeSpecName: "config-volume") pod "5d0dc27f-b4da-4930-9196-3c2d4c21aee2" (UID: "5d0dc27f-b4da-4930-9196-3c2d4c21aee2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:15:03 crc kubenswrapper[4776]: I1204 10:15:03.129587 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5d0dc27f-b4da-4930-9196-3c2d4c21aee2" (UID: "5d0dc27f-b4da-4930-9196-3c2d4c21aee2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:15:03 crc kubenswrapper[4776]: I1204 10:15:03.130091 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-kube-api-access-ztl57" (OuterVolumeSpecName: "kube-api-access-ztl57") pod "5d0dc27f-b4da-4930-9196-3c2d4c21aee2" (UID: "5d0dc27f-b4da-4930-9196-3c2d4c21aee2"). InnerVolumeSpecName "kube-api-access-ztl57". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:15:03 crc kubenswrapper[4776]: I1204 10:15:03.224704 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:03 crc kubenswrapper[4776]: I1204 10:15:03.225044 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztl57\" (UniqueName: \"kubernetes.io/projected/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-kube-api-access-ztl57\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:03 crc kubenswrapper[4776]: I1204 10:15:03.225060 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d0dc27f-b4da-4930-9196-3c2d4c21aee2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:15:03 crc kubenswrapper[4776]: I1204 10:15:03.703469 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" 
event={"ID":"5d0dc27f-b4da-4930-9196-3c2d4c21aee2","Type":"ContainerDied","Data":"5ca170858c03627874f6c2cd71587b58866aed90c84b621e39b8fe920d5564fb"} Dec 04 10:15:03 crc kubenswrapper[4776]: I1204 10:15:03.703518 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ca170858c03627874f6c2cd71587b58866aed90c84b621e39b8fe920d5564fb" Dec 04 10:15:03 crc kubenswrapper[4776]: I1204 10:15:03.703533 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4" Dec 04 10:15:04 crc kubenswrapper[4776]: I1204 10:15:04.123937 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f"] Dec 04 10:15:04 crc kubenswrapper[4776]: I1204 10:15:04.131872 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414010-bp48f"] Dec 04 10:15:05 crc kubenswrapper[4776]: I1204 10:15:05.461653 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17" path="/var/lib/kubelet/pods/a4ed5a1c-5c95-409b-8f5b-cbcd4b334c17/volumes" Dec 04 10:15:19 crc kubenswrapper[4776]: I1204 10:15:19.380316 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:15:19 crc kubenswrapper[4776]: I1204 10:15:19.381792 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:15:21 crc 
kubenswrapper[4776]: I1204 10:15:21.806685 4776 scope.go:117] "RemoveContainer" containerID="47660be036b97a021b847481665ad6a5fc760f50a090b30a8cb4fd1caa779898" Dec 04 10:15:49 crc kubenswrapper[4776]: I1204 10:15:49.380419 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:15:49 crc kubenswrapper[4776]: I1204 10:15:49.381089 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.184082 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dkln7"] Dec 04 10:15:50 crc kubenswrapper[4776]: E1204 10:15:50.184515 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0dc27f-b4da-4930-9196-3c2d4c21aee2" containerName="collect-profiles" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.184537 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0dc27f-b4da-4930-9196-3c2d4c21aee2" containerName="collect-profiles" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.184731 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d0dc27f-b4da-4930-9196-3c2d4c21aee2" containerName="collect-profiles" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.186085 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.199687 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dkln7"] Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.205638 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-utilities\") pod \"community-operators-dkln7\" (UID: \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\") " pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.205748 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z579d\" (UniqueName: \"kubernetes.io/projected/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-kube-api-access-z579d\") pod \"community-operators-dkln7\" (UID: \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\") " pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.205820 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-catalog-content\") pod \"community-operators-dkln7\" (UID: \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\") " pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.307621 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-utilities\") pod \"community-operators-dkln7\" (UID: \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\") " pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.307985 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z579d\" (UniqueName: \"kubernetes.io/projected/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-kube-api-access-z579d\") pod \"community-operators-dkln7\" (UID: \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\") " pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.308089 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-utilities\") pod \"community-operators-dkln7\" (UID: \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\") " pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.308410 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-catalog-content\") pod \"community-operators-dkln7\" (UID: \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\") " pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.308745 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-catalog-content\") pod \"community-operators-dkln7\" (UID: \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\") " pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.327273 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z579d\" (UniqueName: \"kubernetes.io/projected/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-kube-api-access-z579d\") pod \"community-operators-dkln7\" (UID: \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\") " pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:15:50 crc kubenswrapper[4776]: I1204 10:15:50.506945 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:15:51 crc kubenswrapper[4776]: I1204 10:15:51.033767 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dkln7"] Dec 04 10:15:51 crc kubenswrapper[4776]: W1204 10:15:51.039057 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e34b743_f9c9_4dc1_b9de_bcaaf21f1ca9.slice/crio-6d7d41bf88a3430dfdcc3b1d2257d463c166f3f685569e9b42c48eb05d352cac WatchSource:0}: Error finding container 6d7d41bf88a3430dfdcc3b1d2257d463c166f3f685569e9b42c48eb05d352cac: Status 404 returned error can't find the container with id 6d7d41bf88a3430dfdcc3b1d2257d463c166f3f685569e9b42c48eb05d352cac Dec 04 10:15:51 crc kubenswrapper[4776]: I1204 10:15:51.118589 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dkln7" event={"ID":"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9","Type":"ContainerStarted","Data":"6d7d41bf88a3430dfdcc3b1d2257d463c166f3f685569e9b42c48eb05d352cac"} Dec 04 10:15:52 crc kubenswrapper[4776]: I1204 10:15:52.130669 4776 generic.go:334] "Generic (PLEG): container finished" podID="3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" containerID="7430df58c424ea7c13b12ad3278d6859150400d5706c7fa43311b5430605a936" exitCode=0 Dec 04 10:15:52 crc kubenswrapper[4776]: I1204 10:15:52.130741 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dkln7" event={"ID":"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9","Type":"ContainerDied","Data":"7430df58c424ea7c13b12ad3278d6859150400d5706c7fa43311b5430605a936"} Dec 04 10:15:52 crc kubenswrapper[4776]: I1204 10:15:52.134514 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:15:53 crc kubenswrapper[4776]: I1204 10:15:53.142211 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-dkln7" event={"ID":"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9","Type":"ContainerStarted","Data":"267fd80eb5dc8fd98b284bcdfa22c85c3825ce454d505a641205c9ad973d6d8d"} Dec 04 10:15:54 crc kubenswrapper[4776]: I1204 10:15:54.153307 4776 generic.go:334] "Generic (PLEG): container finished" podID="3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" containerID="267fd80eb5dc8fd98b284bcdfa22c85c3825ce454d505a641205c9ad973d6d8d" exitCode=0 Dec 04 10:15:54 crc kubenswrapper[4776]: I1204 10:15:54.154290 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dkln7" event={"ID":"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9","Type":"ContainerDied","Data":"267fd80eb5dc8fd98b284bcdfa22c85c3825ce454d505a641205c9ad973d6d8d"} Dec 04 10:15:55 crc kubenswrapper[4776]: I1204 10:15:55.167139 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dkln7" event={"ID":"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9","Type":"ContainerStarted","Data":"acede488028c3b401a32f99992676ff31d18990c80a6aca87043136bc0a601ae"} Dec 04 10:16:00 crc kubenswrapper[4776]: I1204 10:16:00.508031 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:16:00 crc kubenswrapper[4776]: I1204 10:16:00.508697 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:16:00 crc kubenswrapper[4776]: I1204 10:16:00.552342 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:16:00 crc kubenswrapper[4776]: I1204 10:16:00.581751 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dkln7" podStartSLOduration=7.941404535 podStartE2EDuration="10.581721171s" podCreationTimestamp="2025-12-04 10:15:50 +0000 UTC" 
firstStartedPulling="2025-12-04 10:15:52.134184526 +0000 UTC m=+2197.000664903" lastFinishedPulling="2025-12-04 10:15:54.774501162 +0000 UTC m=+2199.640981539" observedRunningTime="2025-12-04 10:15:55.193478325 +0000 UTC m=+2200.059958712" watchObservedRunningTime="2025-12-04 10:16:00.581721171 +0000 UTC m=+2205.448201548" Dec 04 10:16:01 crc kubenswrapper[4776]: I1204 10:16:01.271593 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:16:01 crc kubenswrapper[4776]: I1204 10:16:01.317854 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dkln7"] Dec 04 10:16:03 crc kubenswrapper[4776]: I1204 10:16:03.246637 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dkln7" podUID="3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" containerName="registry-server" containerID="cri-o://acede488028c3b401a32f99992676ff31d18990c80a6aca87043136bc0a601ae" gracePeriod=2 Dec 04 10:16:03 crc kubenswrapper[4776]: I1204 10:16:03.730569 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:16:03 crc kubenswrapper[4776]: I1204 10:16:03.764124 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z579d\" (UniqueName: \"kubernetes.io/projected/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-kube-api-access-z579d\") pod \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\" (UID: \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\") " Dec 04 10:16:03 crc kubenswrapper[4776]: I1204 10:16:03.764548 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-catalog-content\") pod \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\" (UID: \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\") " Dec 04 10:16:03 crc kubenswrapper[4776]: I1204 10:16:03.764650 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-utilities\") pod \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\" (UID: \"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9\") " Dec 04 10:16:03 crc kubenswrapper[4776]: I1204 10:16:03.766586 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-utilities" (OuterVolumeSpecName: "utilities") pod "3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" (UID: "3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:16:03 crc kubenswrapper[4776]: I1204 10:16:03.770648 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-kube-api-access-z579d" (OuterVolumeSpecName: "kube-api-access-z579d") pod "3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" (UID: "3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9"). InnerVolumeSpecName "kube-api-access-z579d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:16:03 crc kubenswrapper[4776]: I1204 10:16:03.834161 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" (UID: "3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:16:03 crc kubenswrapper[4776]: I1204 10:16:03.867030 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:16:03 crc kubenswrapper[4776]: I1204 10:16:03.867077 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:16:03 crc kubenswrapper[4776]: I1204 10:16:03.867089 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z579d\" (UniqueName: \"kubernetes.io/projected/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9-kube-api-access-z579d\") on node \"crc\" DevicePath \"\"" Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.257229 4776 generic.go:334] "Generic (PLEG): container finished" podID="3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" containerID="acede488028c3b401a32f99992676ff31d18990c80a6aca87043136bc0a601ae" exitCode=0 Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.257284 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dkln7" event={"ID":"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9","Type":"ContainerDied","Data":"acede488028c3b401a32f99992676ff31d18990c80a6aca87043136bc0a601ae"} Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.257299 4776 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-dkln7" Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.257341 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dkln7" event={"ID":"3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9","Type":"ContainerDied","Data":"6d7d41bf88a3430dfdcc3b1d2257d463c166f3f685569e9b42c48eb05d352cac"} Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.257398 4776 scope.go:117] "RemoveContainer" containerID="acede488028c3b401a32f99992676ff31d18990c80a6aca87043136bc0a601ae" Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.277130 4776 scope.go:117] "RemoveContainer" containerID="267fd80eb5dc8fd98b284bcdfa22c85c3825ce454d505a641205c9ad973d6d8d" Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.314161 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dkln7"] Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.315273 4776 scope.go:117] "RemoveContainer" containerID="7430df58c424ea7c13b12ad3278d6859150400d5706c7fa43311b5430605a936" Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.326325 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dkln7"] Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.339727 4776 scope.go:117] "RemoveContainer" containerID="acede488028c3b401a32f99992676ff31d18990c80a6aca87043136bc0a601ae" Dec 04 10:16:04 crc kubenswrapper[4776]: E1204 10:16:04.340161 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acede488028c3b401a32f99992676ff31d18990c80a6aca87043136bc0a601ae\": container with ID starting with acede488028c3b401a32f99992676ff31d18990c80a6aca87043136bc0a601ae not found: ID does not exist" containerID="acede488028c3b401a32f99992676ff31d18990c80a6aca87043136bc0a601ae" Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.340202 
4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acede488028c3b401a32f99992676ff31d18990c80a6aca87043136bc0a601ae"} err="failed to get container status \"acede488028c3b401a32f99992676ff31d18990c80a6aca87043136bc0a601ae\": rpc error: code = NotFound desc = could not find container \"acede488028c3b401a32f99992676ff31d18990c80a6aca87043136bc0a601ae\": container with ID starting with acede488028c3b401a32f99992676ff31d18990c80a6aca87043136bc0a601ae not found: ID does not exist" Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.340229 4776 scope.go:117] "RemoveContainer" containerID="267fd80eb5dc8fd98b284bcdfa22c85c3825ce454d505a641205c9ad973d6d8d" Dec 04 10:16:04 crc kubenswrapper[4776]: E1204 10:16:04.340704 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"267fd80eb5dc8fd98b284bcdfa22c85c3825ce454d505a641205c9ad973d6d8d\": container with ID starting with 267fd80eb5dc8fd98b284bcdfa22c85c3825ce454d505a641205c9ad973d6d8d not found: ID does not exist" containerID="267fd80eb5dc8fd98b284bcdfa22c85c3825ce454d505a641205c9ad973d6d8d" Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.340747 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"267fd80eb5dc8fd98b284bcdfa22c85c3825ce454d505a641205c9ad973d6d8d"} err="failed to get container status \"267fd80eb5dc8fd98b284bcdfa22c85c3825ce454d505a641205c9ad973d6d8d\": rpc error: code = NotFound desc = could not find container \"267fd80eb5dc8fd98b284bcdfa22c85c3825ce454d505a641205c9ad973d6d8d\": container with ID starting with 267fd80eb5dc8fd98b284bcdfa22c85c3825ce454d505a641205c9ad973d6d8d not found: ID does not exist" Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.340774 4776 scope.go:117] "RemoveContainer" containerID="7430df58c424ea7c13b12ad3278d6859150400d5706c7fa43311b5430605a936" Dec 04 10:16:04 crc kubenswrapper[4776]: E1204 
10:16:04.341098 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7430df58c424ea7c13b12ad3278d6859150400d5706c7fa43311b5430605a936\": container with ID starting with 7430df58c424ea7c13b12ad3278d6859150400d5706c7fa43311b5430605a936 not found: ID does not exist" containerID="7430df58c424ea7c13b12ad3278d6859150400d5706c7fa43311b5430605a936" Dec 04 10:16:04 crc kubenswrapper[4776]: I1204 10:16:04.341120 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7430df58c424ea7c13b12ad3278d6859150400d5706c7fa43311b5430605a936"} err="failed to get container status \"7430df58c424ea7c13b12ad3278d6859150400d5706c7fa43311b5430605a936\": rpc error: code = NotFound desc = could not find container \"7430df58c424ea7c13b12ad3278d6859150400d5706c7fa43311b5430605a936\": container with ID starting with 7430df58c424ea7c13b12ad3278d6859150400d5706c7fa43311b5430605a936 not found: ID does not exist" Dec 04 10:16:05 crc kubenswrapper[4776]: I1204 10:16:05.463039 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" path="/var/lib/kubelet/pods/3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9/volumes" Dec 04 10:16:19 crc kubenswrapper[4776]: I1204 10:16:19.380304 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:16:19 crc kubenswrapper[4776]: I1204 10:16:19.380896 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 04 10:16:19 crc kubenswrapper[4776]: I1204 10:16:19.380965 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 10:16:19 crc kubenswrapper[4776]: I1204 10:16:19.381716 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:16:19 crc kubenswrapper[4776]: I1204 10:16:19.381813 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" gracePeriod=600 Dec 04 10:16:19 crc kubenswrapper[4776]: E1204 10:16:19.501234 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:16:19 crc kubenswrapper[4776]: I1204 10:16:19.674952 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" exitCode=0 Dec 04 10:16:19 crc kubenswrapper[4776]: I1204 10:16:19.675002 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" 
event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572"} Dec 04 10:16:19 crc kubenswrapper[4776]: I1204 10:16:19.675042 4776 scope.go:117] "RemoveContainer" containerID="2ef57bb754c7648e58638acbf6214793456368538049ee0eeb6ee6e04ffa60f3" Dec 04 10:16:19 crc kubenswrapper[4776]: I1204 10:16:19.675714 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:16:19 crc kubenswrapper[4776]: E1204 10:16:19.676027 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.515021 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2k52t"] Dec 04 10:16:29 crc kubenswrapper[4776]: E1204 10:16:29.516078 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" containerName="registry-server" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.516097 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" containerName="registry-server" Dec 04 10:16:29 crc kubenswrapper[4776]: E1204 10:16:29.516114 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" containerName="extract-content" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.516123 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" containerName="extract-content" Dec 04 10:16:29 crc 
kubenswrapper[4776]: E1204 10:16:29.516151 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" containerName="extract-utilities" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.516159 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" containerName="extract-utilities" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.516390 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e34b743-f9c9-4dc1-b9de-bcaaf21f1ca9" containerName="registry-server" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.518241 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.527689 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2k52t"] Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.590177 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66a41678-4153-41dd-9da1-b385ef62048f-catalog-content\") pod \"redhat-marketplace-2k52t\" (UID: \"66a41678-4153-41dd-9da1-b385ef62048f\") " pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.590688 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmtmb\" (UniqueName: \"kubernetes.io/projected/66a41678-4153-41dd-9da1-b385ef62048f-kube-api-access-cmtmb\") pod \"redhat-marketplace-2k52t\" (UID: \"66a41678-4153-41dd-9da1-b385ef62048f\") " pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.590858 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/66a41678-4153-41dd-9da1-b385ef62048f-utilities\") pod \"redhat-marketplace-2k52t\" (UID: \"66a41678-4153-41dd-9da1-b385ef62048f\") " pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.693680 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmtmb\" (UniqueName: \"kubernetes.io/projected/66a41678-4153-41dd-9da1-b385ef62048f-kube-api-access-cmtmb\") pod \"redhat-marketplace-2k52t\" (UID: \"66a41678-4153-41dd-9da1-b385ef62048f\") " pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.693764 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66a41678-4153-41dd-9da1-b385ef62048f-utilities\") pod \"redhat-marketplace-2k52t\" (UID: \"66a41678-4153-41dd-9da1-b385ef62048f\") " pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.693838 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66a41678-4153-41dd-9da1-b385ef62048f-catalog-content\") pod \"redhat-marketplace-2k52t\" (UID: \"66a41678-4153-41dd-9da1-b385ef62048f\") " pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.694363 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66a41678-4153-41dd-9da1-b385ef62048f-utilities\") pod \"redhat-marketplace-2k52t\" (UID: \"66a41678-4153-41dd-9da1-b385ef62048f\") " pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.694416 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/66a41678-4153-41dd-9da1-b385ef62048f-catalog-content\") pod \"redhat-marketplace-2k52t\" (UID: \"66a41678-4153-41dd-9da1-b385ef62048f\") " pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.724009 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmtmb\" (UniqueName: \"kubernetes.io/projected/66a41678-4153-41dd-9da1-b385ef62048f-kube-api-access-cmtmb\") pod \"redhat-marketplace-2k52t\" (UID: \"66a41678-4153-41dd-9da1-b385ef62048f\") " pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:29 crc kubenswrapper[4776]: I1204 10:16:29.848148 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:30 crc kubenswrapper[4776]: I1204 10:16:30.337171 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2k52t"] Dec 04 10:16:30 crc kubenswrapper[4776]: I1204 10:16:30.788277 4776 generic.go:334] "Generic (PLEG): container finished" podID="66a41678-4153-41dd-9da1-b385ef62048f" containerID="c2fdd397f4e2fa5be0fa109b72c595857956a66405e57c413122695884048426" exitCode=0 Dec 04 10:16:30 crc kubenswrapper[4776]: I1204 10:16:30.788480 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2k52t" event={"ID":"66a41678-4153-41dd-9da1-b385ef62048f","Type":"ContainerDied","Data":"c2fdd397f4e2fa5be0fa109b72c595857956a66405e57c413122695884048426"} Dec 04 10:16:30 crc kubenswrapper[4776]: I1204 10:16:30.788851 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2k52t" event={"ID":"66a41678-4153-41dd-9da1-b385ef62048f","Type":"ContainerStarted","Data":"a64075eb84d26e3b838bf862238ecbcade7f68bdaebdeadc700b0ecb0c81d979"} Dec 04 10:16:31 crc kubenswrapper[4776]: I1204 10:16:31.801148 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-2k52t" event={"ID":"66a41678-4153-41dd-9da1-b385ef62048f","Type":"ContainerStarted","Data":"91d160e7b7a47e684096fffc4794ad71f4ef60b1fccb6fca3b9aa09ec3a50873"} Dec 04 10:16:32 crc kubenswrapper[4776]: I1204 10:16:32.810336 4776 generic.go:334] "Generic (PLEG): container finished" podID="66a41678-4153-41dd-9da1-b385ef62048f" containerID="91d160e7b7a47e684096fffc4794ad71f4ef60b1fccb6fca3b9aa09ec3a50873" exitCode=0 Dec 04 10:16:32 crc kubenswrapper[4776]: I1204 10:16:32.810400 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2k52t" event={"ID":"66a41678-4153-41dd-9da1-b385ef62048f","Type":"ContainerDied","Data":"91d160e7b7a47e684096fffc4794ad71f4ef60b1fccb6fca3b9aa09ec3a50873"} Dec 04 10:16:33 crc kubenswrapper[4776]: I1204 10:16:33.819984 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2k52t" event={"ID":"66a41678-4153-41dd-9da1-b385ef62048f","Type":"ContainerStarted","Data":"85e7c5d6c0e2e6192d163f199cc077584ff36520907e050c0ef97c729601fd3c"} Dec 04 10:16:33 crc kubenswrapper[4776]: I1204 10:16:33.844479 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2k52t" podStartSLOduration=2.432162676 podStartE2EDuration="4.844455388s" podCreationTimestamp="2025-12-04 10:16:29 +0000 UTC" firstStartedPulling="2025-12-04 10:16:30.790347713 +0000 UTC m=+2235.656828090" lastFinishedPulling="2025-12-04 10:16:33.202640435 +0000 UTC m=+2238.069120802" observedRunningTime="2025-12-04 10:16:33.83814253 +0000 UTC m=+2238.704622927" watchObservedRunningTime="2025-12-04 10:16:33.844455388 +0000 UTC m=+2238.710935765" Dec 04 10:16:34 crc kubenswrapper[4776]: I1204 10:16:34.452340 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:16:34 crc kubenswrapper[4776]: E1204 10:16:34.452991 4776 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:16:39 crc kubenswrapper[4776]: I1204 10:16:39.848393 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:39 crc kubenswrapper[4776]: I1204 10:16:39.849032 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:39 crc kubenswrapper[4776]: I1204 10:16:39.899310 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:39 crc kubenswrapper[4776]: I1204 10:16:39.957730 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:40 crc kubenswrapper[4776]: I1204 10:16:40.144554 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2k52t"] Dec 04 10:16:41 crc kubenswrapper[4776]: I1204 10:16:41.899798 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2k52t" podUID="66a41678-4153-41dd-9da1-b385ef62048f" containerName="registry-server" containerID="cri-o://85e7c5d6c0e2e6192d163f199cc077584ff36520907e050c0ef97c729601fd3c" gracePeriod=2 Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.350366 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.382348 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmtmb\" (UniqueName: \"kubernetes.io/projected/66a41678-4153-41dd-9da1-b385ef62048f-kube-api-access-cmtmb\") pod \"66a41678-4153-41dd-9da1-b385ef62048f\" (UID: \"66a41678-4153-41dd-9da1-b385ef62048f\") " Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.382609 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66a41678-4153-41dd-9da1-b385ef62048f-catalog-content\") pod \"66a41678-4153-41dd-9da1-b385ef62048f\" (UID: \"66a41678-4153-41dd-9da1-b385ef62048f\") " Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.382655 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66a41678-4153-41dd-9da1-b385ef62048f-utilities\") pod \"66a41678-4153-41dd-9da1-b385ef62048f\" (UID: \"66a41678-4153-41dd-9da1-b385ef62048f\") " Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.383704 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66a41678-4153-41dd-9da1-b385ef62048f-utilities" (OuterVolumeSpecName: "utilities") pod "66a41678-4153-41dd-9da1-b385ef62048f" (UID: "66a41678-4153-41dd-9da1-b385ef62048f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.389164 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66a41678-4153-41dd-9da1-b385ef62048f-kube-api-access-cmtmb" (OuterVolumeSpecName: "kube-api-access-cmtmb") pod "66a41678-4153-41dd-9da1-b385ef62048f" (UID: "66a41678-4153-41dd-9da1-b385ef62048f"). InnerVolumeSpecName "kube-api-access-cmtmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.404507 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66a41678-4153-41dd-9da1-b385ef62048f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66a41678-4153-41dd-9da1-b385ef62048f" (UID: "66a41678-4153-41dd-9da1-b385ef62048f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.485110 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmtmb\" (UniqueName: \"kubernetes.io/projected/66a41678-4153-41dd-9da1-b385ef62048f-kube-api-access-cmtmb\") on node \"crc\" DevicePath \"\"" Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.485156 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66a41678-4153-41dd-9da1-b385ef62048f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.485167 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66a41678-4153-41dd-9da1-b385ef62048f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.911582 4776 generic.go:334] "Generic (PLEG): container finished" podID="66a41678-4153-41dd-9da1-b385ef62048f" containerID="85e7c5d6c0e2e6192d163f199cc077584ff36520907e050c0ef97c729601fd3c" exitCode=0 Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.911636 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2k52t" event={"ID":"66a41678-4153-41dd-9da1-b385ef62048f","Type":"ContainerDied","Data":"85e7c5d6c0e2e6192d163f199cc077584ff36520907e050c0ef97c729601fd3c"} Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.911676 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-2k52t" event={"ID":"66a41678-4153-41dd-9da1-b385ef62048f","Type":"ContainerDied","Data":"a64075eb84d26e3b838bf862238ecbcade7f68bdaebdeadc700b0ecb0c81d979"} Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.911699 4776 scope.go:117] "RemoveContainer" containerID="85e7c5d6c0e2e6192d163f199cc077584ff36520907e050c0ef97c729601fd3c" Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.911715 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2k52t" Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.944045 4776 scope.go:117] "RemoveContainer" containerID="91d160e7b7a47e684096fffc4794ad71f4ef60b1fccb6fca3b9aa09ec3a50873" Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.954191 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2k52t"] Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.962698 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2k52t"] Dec 04 10:16:42 crc kubenswrapper[4776]: I1204 10:16:42.977554 4776 scope.go:117] "RemoveContainer" containerID="c2fdd397f4e2fa5be0fa109b72c595857956a66405e57c413122695884048426" Dec 04 10:16:43 crc kubenswrapper[4776]: I1204 10:16:43.003345 4776 scope.go:117] "RemoveContainer" containerID="85e7c5d6c0e2e6192d163f199cc077584ff36520907e050c0ef97c729601fd3c" Dec 04 10:16:43 crc kubenswrapper[4776]: E1204 10:16:43.004016 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e7c5d6c0e2e6192d163f199cc077584ff36520907e050c0ef97c729601fd3c\": container with ID starting with 85e7c5d6c0e2e6192d163f199cc077584ff36520907e050c0ef97c729601fd3c not found: ID does not exist" containerID="85e7c5d6c0e2e6192d163f199cc077584ff36520907e050c0ef97c729601fd3c" Dec 04 10:16:43 crc kubenswrapper[4776]: I1204 10:16:43.004048 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e7c5d6c0e2e6192d163f199cc077584ff36520907e050c0ef97c729601fd3c"} err="failed to get container status \"85e7c5d6c0e2e6192d163f199cc077584ff36520907e050c0ef97c729601fd3c\": rpc error: code = NotFound desc = could not find container \"85e7c5d6c0e2e6192d163f199cc077584ff36520907e050c0ef97c729601fd3c\": container with ID starting with 85e7c5d6c0e2e6192d163f199cc077584ff36520907e050c0ef97c729601fd3c not found: ID does not exist" Dec 04 10:16:43 crc kubenswrapper[4776]: I1204 10:16:43.004080 4776 scope.go:117] "RemoveContainer" containerID="91d160e7b7a47e684096fffc4794ad71f4ef60b1fccb6fca3b9aa09ec3a50873" Dec 04 10:16:43 crc kubenswrapper[4776]: E1204 10:16:43.004405 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d160e7b7a47e684096fffc4794ad71f4ef60b1fccb6fca3b9aa09ec3a50873\": container with ID starting with 91d160e7b7a47e684096fffc4794ad71f4ef60b1fccb6fca3b9aa09ec3a50873 not found: ID does not exist" containerID="91d160e7b7a47e684096fffc4794ad71f4ef60b1fccb6fca3b9aa09ec3a50873" Dec 04 10:16:43 crc kubenswrapper[4776]: I1204 10:16:43.004434 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d160e7b7a47e684096fffc4794ad71f4ef60b1fccb6fca3b9aa09ec3a50873"} err="failed to get container status \"91d160e7b7a47e684096fffc4794ad71f4ef60b1fccb6fca3b9aa09ec3a50873\": rpc error: code = NotFound desc = could not find container \"91d160e7b7a47e684096fffc4794ad71f4ef60b1fccb6fca3b9aa09ec3a50873\": container with ID starting with 91d160e7b7a47e684096fffc4794ad71f4ef60b1fccb6fca3b9aa09ec3a50873 not found: ID does not exist" Dec 04 10:16:43 crc kubenswrapper[4776]: I1204 10:16:43.004453 4776 scope.go:117] "RemoveContainer" containerID="c2fdd397f4e2fa5be0fa109b72c595857956a66405e57c413122695884048426" Dec 04 10:16:43 crc kubenswrapper[4776]: E1204 
10:16:43.004887 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2fdd397f4e2fa5be0fa109b72c595857956a66405e57c413122695884048426\": container with ID starting with c2fdd397f4e2fa5be0fa109b72c595857956a66405e57c413122695884048426 not found: ID does not exist" containerID="c2fdd397f4e2fa5be0fa109b72c595857956a66405e57c413122695884048426" Dec 04 10:16:43 crc kubenswrapper[4776]: I1204 10:16:43.004957 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2fdd397f4e2fa5be0fa109b72c595857956a66405e57c413122695884048426"} err="failed to get container status \"c2fdd397f4e2fa5be0fa109b72c595857956a66405e57c413122695884048426\": rpc error: code = NotFound desc = could not find container \"c2fdd397f4e2fa5be0fa109b72c595857956a66405e57c413122695884048426\": container with ID starting with c2fdd397f4e2fa5be0fa109b72c595857956a66405e57c413122695884048426 not found: ID does not exist" Dec 04 10:16:43 crc kubenswrapper[4776]: I1204 10:16:43.462792 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66a41678-4153-41dd-9da1-b385ef62048f" path="/var/lib/kubelet/pods/66a41678-4153-41dd-9da1-b385ef62048f/volumes" Dec 04 10:16:49 crc kubenswrapper[4776]: I1204 10:16:49.454289 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:16:49 crc kubenswrapper[4776]: E1204 10:16:49.455061 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:17:00 crc kubenswrapper[4776]: I1204 10:17:00.453883 
4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:17:00 crc kubenswrapper[4776]: E1204 10:17:00.454705 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:17:14 crc kubenswrapper[4776]: I1204 10:17:14.452067 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:17:14 crc kubenswrapper[4776]: E1204 10:17:14.452947 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:17:25 crc kubenswrapper[4776]: I1204 10:17:25.466032 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:17:25 crc kubenswrapper[4776]: E1204 10:17:25.467075 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:17:38 crc kubenswrapper[4776]: I1204 
10:17:38.452108 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:17:38 crc kubenswrapper[4776]: E1204 10:17:38.452811 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:17:49 crc kubenswrapper[4776]: I1204 10:17:49.452950 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:17:49 crc kubenswrapper[4776]: E1204 10:17:49.453756 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.025678 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.039352 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.047749 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gvhcw"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.054752 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ssh-known-hosts-edpm-deployment-mts5t"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.061237 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mts5t"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.067863 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.074770 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.081255 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-494qn"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.087570 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.094735 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.101591 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.108555 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.115848 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.124858 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5flb7"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.132519 4776 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l4jmp"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.139839 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v76f2"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.148800 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-z7zdv"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.156348 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-knqvn"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.163524 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c55fq"] Dec 04 10:17:52 crc kubenswrapper[4776]: I1204 10:17:52.171132 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hhn6"] Dec 04 10:17:53 crc kubenswrapper[4776]: I1204 10:17:53.469579 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a4c21c-5146-4d8d-9c2a-13c081f134c8" path="/var/lib/kubelet/pods/13a4c21c-5146-4d8d-9c2a-13c081f134c8/volumes" Dec 04 10:17:53 crc kubenswrapper[4776]: I1204 10:17:53.471142 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29858f05-6bc6-4f33-ae7e-e65c4737eed7" path="/var/lib/kubelet/pods/29858f05-6bc6-4f33-ae7e-e65c4737eed7/volumes" Dec 04 10:17:53 crc kubenswrapper[4776]: I1204 10:17:53.471742 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1bd233-1d50-4916-8735-65b29b8dac15" path="/var/lib/kubelet/pods/2f1bd233-1d50-4916-8735-65b29b8dac15/volumes" Dec 04 10:17:53 crc kubenswrapper[4776]: I1204 10:17:53.472409 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4282b94f-74cf-4ce8-9ab1-7235dfa27a56" 
path="/var/lib/kubelet/pods/4282b94f-74cf-4ce8-9ab1-7235dfa27a56/volumes" Dec 04 10:17:53 crc kubenswrapper[4776]: I1204 10:17:53.473749 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44079ddd-2a33-421b-a2ba-359a08689df6" path="/var/lib/kubelet/pods/44079ddd-2a33-421b-a2ba-359a08689df6/volumes" Dec 04 10:17:53 crc kubenswrapper[4776]: I1204 10:17:53.474659 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d88dbcd-c42d-49c5-b71d-6b90735ba4fb" path="/var/lib/kubelet/pods/4d88dbcd-c42d-49c5-b71d-6b90735ba4fb/volumes" Dec 04 10:17:53 crc kubenswrapper[4776]: I1204 10:17:53.475504 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d0203c8-396c-4504-904c-18a82d237314" path="/var/lib/kubelet/pods/8d0203c8-396c-4504-904c-18a82d237314/volumes" Dec 04 10:17:53 crc kubenswrapper[4776]: I1204 10:17:53.476778 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef0edf3-6a05-41cf-af58-f98b8b7492fa" path="/var/lib/kubelet/pods/8ef0edf3-6a05-41cf-af58-f98b8b7492fa/volumes" Dec 04 10:17:53 crc kubenswrapper[4776]: I1204 10:17:53.477633 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3421cf4-b5ee-45c2-a236-02d6ae85f719" path="/var/lib/kubelet/pods/d3421cf4-b5ee-45c2-a236-02d6ae85f719/volumes" Dec 04 10:17:53 crc kubenswrapper[4776]: I1204 10:17:53.478518 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f802e144-1ad6-4d7c-a0fb-5935d8ae53d8" path="/var/lib/kubelet/pods/f802e144-1ad6-4d7c-a0fb-5935d8ae53d8/volumes" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.343072 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9"] Dec 04 10:17:58 crc kubenswrapper[4776]: E1204 10:17:58.344579 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a41678-4153-41dd-9da1-b385ef62048f" containerName="extract-content" Dec 04 10:17:58 crc 
kubenswrapper[4776]: I1204 10:17:58.344596 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a41678-4153-41dd-9da1-b385ef62048f" containerName="extract-content" Dec 04 10:17:58 crc kubenswrapper[4776]: E1204 10:17:58.344619 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a41678-4153-41dd-9da1-b385ef62048f" containerName="extract-utilities" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.344626 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a41678-4153-41dd-9da1-b385ef62048f" containerName="extract-utilities" Dec 04 10:17:58 crc kubenswrapper[4776]: E1204 10:17:58.344676 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a41678-4153-41dd-9da1-b385ef62048f" containerName="registry-server" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.344686 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a41678-4153-41dd-9da1-b385ef62048f" containerName="registry-server" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.345169 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="66a41678-4153-41dd-9da1-b385ef62048f" containerName="registry-server" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.347084 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.354514 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.354810 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.354953 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.355136 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.355454 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.357052 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9"] Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.544453 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.544540 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.544589 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vbqd\" (UniqueName: \"kubernetes.io/projected/6ac53b67-6fc4-413b-b712-80cc35fd786e-kube-api-access-9vbqd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.545317 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.545607 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.647578 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.647653 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.648620 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.648646 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.648665 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vbqd\" (UniqueName: \"kubernetes.io/projected/6ac53b67-6fc4-413b-b712-80cc35fd786e-kube-api-access-9vbqd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.654765 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") 
" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.655174 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.655798 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.661616 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.672050 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vbqd\" (UniqueName: \"kubernetes.io/projected/6ac53b67-6fc4-413b-b712-80cc35fd786e-kube-api-access-9vbqd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:58 crc kubenswrapper[4776]: I1204 10:17:58.679935 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:17:59 crc kubenswrapper[4776]: I1204 10:17:59.255358 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9"] Dec 04 10:17:59 crc kubenswrapper[4776]: I1204 10:17:59.546458 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" event={"ID":"6ac53b67-6fc4-413b-b712-80cc35fd786e","Type":"ContainerStarted","Data":"09d90ffd8197b4bf91c7ddc2360527d93f73d0537cb5e8c693bcc81d1c03ec3b"} Dec 04 10:18:00 crc kubenswrapper[4776]: I1204 10:18:00.557227 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" event={"ID":"6ac53b67-6fc4-413b-b712-80cc35fd786e","Type":"ContainerStarted","Data":"eaeb001ad7eb1e3c952a2039c18738b15cfe3988d6ee55a8c41e03624156128e"} Dec 04 10:18:00 crc kubenswrapper[4776]: I1204 10:18:00.582881 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" podStartSLOduration=2.145967314 podStartE2EDuration="2.582862679s" podCreationTimestamp="2025-12-04 10:17:58 +0000 UTC" firstStartedPulling="2025-12-04 10:17:59.261733646 +0000 UTC m=+2324.128214023" lastFinishedPulling="2025-12-04 10:17:59.698629011 +0000 UTC m=+2324.565109388" observedRunningTime="2025-12-04 10:18:00.576942753 +0000 UTC m=+2325.443423140" watchObservedRunningTime="2025-12-04 10:18:00.582862679 +0000 UTC m=+2325.449343056" Dec 04 10:18:01 crc kubenswrapper[4776]: I1204 10:18:01.452485 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:18:01 crc kubenswrapper[4776]: E1204 10:18:01.452956 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:18:11 crc kubenswrapper[4776]: I1204 10:18:11.657514 4776 generic.go:334] "Generic (PLEG): container finished" podID="6ac53b67-6fc4-413b-b712-80cc35fd786e" containerID="eaeb001ad7eb1e3c952a2039c18738b15cfe3988d6ee55a8c41e03624156128e" exitCode=0 Dec 04 10:18:11 crc kubenswrapper[4776]: I1204 10:18:11.657579 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" event={"ID":"6ac53b67-6fc4-413b-b712-80cc35fd786e","Type":"ContainerDied","Data":"eaeb001ad7eb1e3c952a2039c18738b15cfe3988d6ee55a8c41e03624156128e"} Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.068808 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.225141 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-ssh-key\") pod \"6ac53b67-6fc4-413b-b712-80cc35fd786e\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.225204 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-ceph\") pod \"6ac53b67-6fc4-413b-b712-80cc35fd786e\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.225267 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-repo-setup-combined-ca-bundle\") pod \"6ac53b67-6fc4-413b-b712-80cc35fd786e\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.225492 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-inventory\") pod \"6ac53b67-6fc4-413b-b712-80cc35fd786e\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.225705 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vbqd\" (UniqueName: \"kubernetes.io/projected/6ac53b67-6fc4-413b-b712-80cc35fd786e-kube-api-access-9vbqd\") pod \"6ac53b67-6fc4-413b-b712-80cc35fd786e\" (UID: \"6ac53b67-6fc4-413b-b712-80cc35fd786e\") " Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.231049 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-ceph" (OuterVolumeSpecName: "ceph") pod "6ac53b67-6fc4-413b-b712-80cc35fd786e" (UID: "6ac53b67-6fc4-413b-b712-80cc35fd786e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.232828 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac53b67-6fc4-413b-b712-80cc35fd786e-kube-api-access-9vbqd" (OuterVolumeSpecName: "kube-api-access-9vbqd") pod "6ac53b67-6fc4-413b-b712-80cc35fd786e" (UID: "6ac53b67-6fc4-413b-b712-80cc35fd786e"). InnerVolumeSpecName "kube-api-access-9vbqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.236138 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6ac53b67-6fc4-413b-b712-80cc35fd786e" (UID: "6ac53b67-6fc4-413b-b712-80cc35fd786e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.257069 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6ac53b67-6fc4-413b-b712-80cc35fd786e" (UID: "6ac53b67-6fc4-413b-b712-80cc35fd786e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.257521 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-inventory" (OuterVolumeSpecName: "inventory") pod "6ac53b67-6fc4-413b-b712-80cc35fd786e" (UID: "6ac53b67-6fc4-413b-b712-80cc35fd786e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.328434 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.328487 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.328501 4776 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.328518 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ac53b67-6fc4-413b-b712-80cc35fd786e-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.328532 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vbqd\" (UniqueName: \"kubernetes.io/projected/6ac53b67-6fc4-413b-b712-80cc35fd786e-kube-api-access-9vbqd\") on node \"crc\" DevicePath \"\"" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.680679 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" event={"ID":"6ac53b67-6fc4-413b-b712-80cc35fd786e","Type":"ContainerDied","Data":"09d90ffd8197b4bf91c7ddc2360527d93f73d0537cb5e8c693bcc81d1c03ec3b"} Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.680724 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09d90ffd8197b4bf91c7ddc2360527d93f73d0537cb5e8c693bcc81d1c03ec3b" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.680761 
4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.758963 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78"] Dec 04 10:18:13 crc kubenswrapper[4776]: E1204 10:18:13.759435 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac53b67-6fc4-413b-b712-80cc35fd786e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.759459 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac53b67-6fc4-413b-b712-80cc35fd786e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.759665 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac53b67-6fc4-413b-b712-80cc35fd786e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.760380 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.762953 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.763238 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.763452 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.763536 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.763960 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.768044 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78"] Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.941392 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.941452 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.942311 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6wc9\" (UniqueName: \"kubernetes.io/projected/472ce27b-24c6-4557-9775-971817286847-kube-api-access-l6wc9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.942435 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:13 crc kubenswrapper[4776]: I1204 10:18:13.942690 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:14 crc kubenswrapper[4776]: I1204 10:18:14.044811 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6wc9\" (UniqueName: \"kubernetes.io/projected/472ce27b-24c6-4557-9775-971817286847-kube-api-access-l6wc9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:14 crc kubenswrapper[4776]: I1204 10:18:14.044884 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:14 crc kubenswrapper[4776]: I1204 10:18:14.044970 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:14 crc kubenswrapper[4776]: I1204 10:18:14.045004 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:14 crc kubenswrapper[4776]: I1204 10:18:14.045036 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:14 crc kubenswrapper[4776]: I1204 10:18:14.049273 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:14 crc kubenswrapper[4776]: I1204 10:18:14.049435 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:14 crc kubenswrapper[4776]: I1204 10:18:14.049583 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:14 crc kubenswrapper[4776]: I1204 10:18:14.056755 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:14 crc kubenswrapper[4776]: I1204 10:18:14.061718 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6wc9\" (UniqueName: \"kubernetes.io/projected/472ce27b-24c6-4557-9775-971817286847-kube-api-access-l6wc9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:14 crc kubenswrapper[4776]: I1204 10:18:14.084018 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:18:14 crc kubenswrapper[4776]: I1204 10:18:14.749050 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78"] Dec 04 10:18:15 crc kubenswrapper[4776]: I1204 10:18:15.459504 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:18:15 crc kubenswrapper[4776]: E1204 10:18:15.460209 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:18:15 crc kubenswrapper[4776]: I1204 10:18:15.739666 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" event={"ID":"472ce27b-24c6-4557-9775-971817286847","Type":"ContainerStarted","Data":"b1b3543e2aa0b9e66e8a91179f30c69503f17e4a63a61008ff716e0d1f7cb582"} Dec 04 10:18:16 crc kubenswrapper[4776]: I1204 10:18:16.751620 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" event={"ID":"472ce27b-24c6-4557-9775-971817286847","Type":"ContainerStarted","Data":"500f6b990f8c77344445629c4d438a7330df9b2965abbb5c9ce75bf7c6be7bde"} Dec 04 10:18:21 crc kubenswrapper[4776]: I1204 10:18:21.953407 4776 scope.go:117] "RemoveContainer" containerID="2d1e1295148b75dc91863a4f5ba4654a148e598567b907ca1769f7bf41e7f28c" Dec 04 10:18:21 crc kubenswrapper[4776]: I1204 10:18:21.994433 4776 scope.go:117] "RemoveContainer" containerID="fa1993096d4e071b0bfe8f7ba825de1f6c6fa31bc2325cbc97d6884162d7f6de" Dec 04 
10:18:22 crc kubenswrapper[4776]: I1204 10:18:22.032466 4776 scope.go:117] "RemoveContainer" containerID="f73b9777636353f09e541a566d467454158aa7ee2bc3a628d9384a5742545ba5" Dec 04 10:18:22 crc kubenswrapper[4776]: I1204 10:18:22.089328 4776 scope.go:117] "RemoveContainer" containerID="0b86543dd5d5cfdb3f17829b330b5f97d6f4a697c5a8f31f24fc69a134c31fb1" Dec 04 10:18:22 crc kubenswrapper[4776]: I1204 10:18:22.138671 4776 scope.go:117] "RemoveContainer" containerID="bef53880ab3e659ae3ee6f5191921c88415b9a76ebec0c7859d274b0dcb7cfbd" Dec 04 10:18:22 crc kubenswrapper[4776]: I1204 10:18:22.205028 4776 scope.go:117] "RemoveContainer" containerID="b26a21f0298a452e5e8f5c89ebc31492bf5c8bc6778fc335da75bbe49d982bf7" Dec 04 10:18:22 crc kubenswrapper[4776]: I1204 10:18:22.266984 4776 scope.go:117] "RemoveContainer" containerID="5fae47c28074ecddc2734307cf27e846ef61d3d1c86eb77b19d32ac4a5c9c935" Dec 04 10:18:22 crc kubenswrapper[4776]: I1204 10:18:22.298737 4776 scope.go:117] "RemoveContainer" containerID="c3e3f214157fa63d0bc680ca0d118b2f914cb195604dce268b25f913f5b55f83" Dec 04 10:18:22 crc kubenswrapper[4776]: I1204 10:18:22.331775 4776 scope.go:117] "RemoveContainer" containerID="cb95d2daacf74d0d9dd89b10049914f35825e3efe76a2ccfe35320a9b10f9162" Dec 04 10:18:29 crc kubenswrapper[4776]: I1204 10:18:29.453029 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:18:29 crc kubenswrapper[4776]: E1204 10:18:29.453640 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:18:40 crc kubenswrapper[4776]: I1204 10:18:40.480535 4776 scope.go:117] 
"RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:18:40 crc kubenswrapper[4776]: E1204 10:18:40.482368 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:18:52 crc kubenswrapper[4776]: I1204 10:18:52.452305 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:18:52 crc kubenswrapper[4776]: E1204 10:18:52.453055 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:19:06 crc kubenswrapper[4776]: I1204 10:19:06.452589 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:19:06 crc kubenswrapper[4776]: E1204 10:19:06.453453 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:19:20 crc kubenswrapper[4776]: I1204 10:19:20.452126 
4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:19:20 crc kubenswrapper[4776]: E1204 10:19:20.452812 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:19:22 crc kubenswrapper[4776]: I1204 10:19:22.541173 4776 scope.go:117] "RemoveContainer" containerID="5d2cedd2be5a6ffc816bc3117d5b566cbff7772a0f331660defb9e15af676bab" Dec 04 10:19:31 crc kubenswrapper[4776]: I1204 10:19:31.452541 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:19:31 crc kubenswrapper[4776]: E1204 10:19:31.453261 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:19:46 crc kubenswrapper[4776]: I1204 10:19:46.451749 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:19:46 crc kubenswrapper[4776]: E1204 10:19:46.452570 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:20:00 crc kubenswrapper[4776]: I1204 10:20:00.453016 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:20:00 crc kubenswrapper[4776]: E1204 10:20:00.453863 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:20:03 crc kubenswrapper[4776]: I1204 10:20:03.699290 4776 generic.go:334] "Generic (PLEG): container finished" podID="472ce27b-24c6-4557-9775-971817286847" containerID="500f6b990f8c77344445629c4d438a7330df9b2965abbb5c9ce75bf7c6be7bde" exitCode=0 Dec 04 10:20:03 crc kubenswrapper[4776]: I1204 10:20:03.699377 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" event={"ID":"472ce27b-24c6-4557-9775-971817286847","Type":"ContainerDied","Data":"500f6b990f8c77344445629c4d438a7330df9b2965abbb5c9ce75bf7c6be7bde"} Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.110930 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.178508 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-bootstrap-combined-ca-bundle\") pod \"472ce27b-24c6-4557-9775-971817286847\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.178759 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-ceph\") pod \"472ce27b-24c6-4557-9775-971817286847\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.179312 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-ssh-key\") pod \"472ce27b-24c6-4557-9775-971817286847\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.179675 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6wc9\" (UniqueName: \"kubernetes.io/projected/472ce27b-24c6-4557-9775-971817286847-kube-api-access-l6wc9\") pod \"472ce27b-24c6-4557-9775-971817286847\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.179745 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-inventory\") pod \"472ce27b-24c6-4557-9775-971817286847\" (UID: \"472ce27b-24c6-4557-9775-971817286847\") " Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.185839 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "472ce27b-24c6-4557-9775-971817286847" (UID: "472ce27b-24c6-4557-9775-971817286847"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.186026 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-ceph" (OuterVolumeSpecName: "ceph") pod "472ce27b-24c6-4557-9775-971817286847" (UID: "472ce27b-24c6-4557-9775-971817286847"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.186418 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472ce27b-24c6-4557-9775-971817286847-kube-api-access-l6wc9" (OuterVolumeSpecName: "kube-api-access-l6wc9") pod "472ce27b-24c6-4557-9775-971817286847" (UID: "472ce27b-24c6-4557-9775-971817286847"). InnerVolumeSpecName "kube-api-access-l6wc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.210385 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "472ce27b-24c6-4557-9775-971817286847" (UID: "472ce27b-24c6-4557-9775-971817286847"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.212386 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-inventory" (OuterVolumeSpecName: "inventory") pod "472ce27b-24c6-4557-9775-971817286847" (UID: "472ce27b-24c6-4557-9775-971817286847"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.282993 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6wc9\" (UniqueName: \"kubernetes.io/projected/472ce27b-24c6-4557-9775-971817286847-kube-api-access-l6wc9\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.283308 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.283392 4776 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.283466 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.283543 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/472ce27b-24c6-4557-9775-971817286847-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.721025 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" event={"ID":"472ce27b-24c6-4557-9775-971817286847","Type":"ContainerDied","Data":"b1b3543e2aa0b9e66e8a91179f30c69503f17e4a63a61008ff716e0d1f7cb582"} Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.721090 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1b3543e2aa0b9e66e8a91179f30c69503f17e4a63a61008ff716e0d1f7cb582" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.721106 4776 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.816429 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk"] Dec 04 10:20:05 crc kubenswrapper[4776]: E1204 10:20:05.817150 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472ce27b-24c6-4557-9775-971817286847" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.817178 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="472ce27b-24c6-4557-9775-971817286847" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.817406 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="472ce27b-24c6-4557-9775-971817286847" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.818549 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.822894 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.822940 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.823288 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.823713 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.823970 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.830606 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk"] Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.894245 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-964p9\" (UniqueName: \"kubernetes.io/projected/59d725b6-d177-4b01-a89e-8fca3d2127ae-kube-api-access-964p9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.894369 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk\" (UID: 
\"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.894425 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.894710 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.996452 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.996538 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-964p9\" (UniqueName: \"kubernetes.io/projected/59d725b6-d177-4b01-a89e-8fca3d2127ae-kube-api-access-964p9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.997271 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:05 crc kubenswrapper[4776]: I1204 10:20:05.997547 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:06 crc kubenswrapper[4776]: I1204 10:20:06.006644 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:06 crc kubenswrapper[4776]: I1204 10:20:06.006702 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:06 crc kubenswrapper[4776]: I1204 10:20:06.031213 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:06 crc kubenswrapper[4776]: I1204 10:20:06.039957 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-964p9\" (UniqueName: \"kubernetes.io/projected/59d725b6-d177-4b01-a89e-8fca3d2127ae-kube-api-access-964p9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:06 crc kubenswrapper[4776]: I1204 10:20:06.143263 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:06 crc kubenswrapper[4776]: I1204 10:20:06.948784 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk"] Dec 04 10:20:07 crc kubenswrapper[4776]: I1204 10:20:07.745475 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" event={"ID":"59d725b6-d177-4b01-a89e-8fca3d2127ae","Type":"ContainerStarted","Data":"ecff1241e94813f2452193ec234ecd68acd2dc916573fceb710cb7097822ed18"} Dec 04 10:20:08 crc kubenswrapper[4776]: I1204 10:20:08.756929 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" event={"ID":"59d725b6-d177-4b01-a89e-8fca3d2127ae","Type":"ContainerStarted","Data":"fd32f07be461934d37c16f0e4e03eef72e5bc2ac121c32492028fb6645c4fee9"} Dec 04 10:20:08 crc kubenswrapper[4776]: I1204 10:20:08.778039 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" podStartSLOduration=3.109242578 podStartE2EDuration="3.778016623s" podCreationTimestamp="2025-12-04 10:20:05 +0000 UTC" firstStartedPulling="2025-12-04 
10:20:06.960569407 +0000 UTC m=+2451.827049784" lastFinishedPulling="2025-12-04 10:20:07.629343442 +0000 UTC m=+2452.495823829" observedRunningTime="2025-12-04 10:20:08.769502156 +0000 UTC m=+2453.635982533" watchObservedRunningTime="2025-12-04 10:20:08.778016623 +0000 UTC m=+2453.644497010" Dec 04 10:20:12 crc kubenswrapper[4776]: I1204 10:20:12.452701 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:20:12 crc kubenswrapper[4776]: E1204 10:20:12.453236 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:20:27 crc kubenswrapper[4776]: I1204 10:20:27.452294 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:20:27 crc kubenswrapper[4776]: E1204 10:20:27.453004 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:20:32 crc kubenswrapper[4776]: I1204 10:20:32.947786 4776 generic.go:334] "Generic (PLEG): container finished" podID="59d725b6-d177-4b01-a89e-8fca3d2127ae" containerID="fd32f07be461934d37c16f0e4e03eef72e5bc2ac121c32492028fb6645c4fee9" exitCode=0 Dec 04 10:20:32 crc kubenswrapper[4776]: I1204 10:20:32.947834 4776 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" event={"ID":"59d725b6-d177-4b01-a89e-8fca3d2127ae","Type":"ContainerDied","Data":"fd32f07be461934d37c16f0e4e03eef72e5bc2ac121c32492028fb6645c4fee9"} Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.368572 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.457562 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-ssh-key\") pod \"59d725b6-d177-4b01-a89e-8fca3d2127ae\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.457616 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-964p9\" (UniqueName: \"kubernetes.io/projected/59d725b6-d177-4b01-a89e-8fca3d2127ae-kube-api-access-964p9\") pod \"59d725b6-d177-4b01-a89e-8fca3d2127ae\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.457649 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-ceph\") pod \"59d725b6-d177-4b01-a89e-8fca3d2127ae\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.457724 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-inventory\") pod \"59d725b6-d177-4b01-a89e-8fca3d2127ae\" (UID: \"59d725b6-d177-4b01-a89e-8fca3d2127ae\") " Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.463750 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/59d725b6-d177-4b01-a89e-8fca3d2127ae-kube-api-access-964p9" (OuterVolumeSpecName: "kube-api-access-964p9") pod "59d725b6-d177-4b01-a89e-8fca3d2127ae" (UID: "59d725b6-d177-4b01-a89e-8fca3d2127ae"). InnerVolumeSpecName "kube-api-access-964p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.463765 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-ceph" (OuterVolumeSpecName: "ceph") pod "59d725b6-d177-4b01-a89e-8fca3d2127ae" (UID: "59d725b6-d177-4b01-a89e-8fca3d2127ae"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.488644 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-inventory" (OuterVolumeSpecName: "inventory") pod "59d725b6-d177-4b01-a89e-8fca3d2127ae" (UID: "59d725b6-d177-4b01-a89e-8fca3d2127ae"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.491986 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "59d725b6-d177-4b01-a89e-8fca3d2127ae" (UID: "59d725b6-d177-4b01-a89e-8fca3d2127ae"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.560060 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.560109 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.560126 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59d725b6-d177-4b01-a89e-8fca3d2127ae-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.560137 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-964p9\" (UniqueName: \"kubernetes.io/projected/59d725b6-d177-4b01-a89e-8fca3d2127ae-kube-api-access-964p9\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.964535 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" event={"ID":"59d725b6-d177-4b01-a89e-8fca3d2127ae","Type":"ContainerDied","Data":"ecff1241e94813f2452193ec234ecd68acd2dc916573fceb710cb7097822ed18"} Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.964579 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecff1241e94813f2452193ec234ecd68acd2dc916573fceb710cb7097822ed18" Dec 04 10:20:34 crc kubenswrapper[4776]: I1204 10:20:34.964582 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.049067 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp"] Dec 04 10:20:35 crc kubenswrapper[4776]: E1204 10:20:35.049818 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d725b6-d177-4b01-a89e-8fca3d2127ae" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.049849 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d725b6-d177-4b01-a89e-8fca3d2127ae" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.050091 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="59d725b6-d177-4b01-a89e-8fca3d2127ae" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.050780 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.055823 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.056672 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.056743 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.061233 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.061840 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.072462 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp"] Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.170904 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cdj9\" (UniqueName: \"kubernetes.io/projected/9836203f-04e7-4179-b4fa-8e133dbe8e5a-kube-api-access-2cdj9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9grzp\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.170997 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9grzp\" (UID: 
\"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.171171 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9grzp\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.171191 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9grzp\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.272802 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9grzp\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.272870 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9grzp\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.272952 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2cdj9\" (UniqueName: \"kubernetes.io/projected/9836203f-04e7-4179-b4fa-8e133dbe8e5a-kube-api-access-2cdj9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9grzp\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.272987 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9grzp\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.278146 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9grzp\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.285575 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9grzp\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.286203 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9grzp\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.291290 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cdj9\" (UniqueName: \"kubernetes.io/projected/9836203f-04e7-4179-b4fa-8e133dbe8e5a-kube-api-access-2cdj9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9grzp\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.372608 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:35 crc kubenswrapper[4776]: W1204 10:20:35.881678 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9836203f_04e7_4179_b4fa_8e133dbe8e5a.slice/crio-9da9545bddf199d8f0bce9b6bda67abc8f02460ef971bc730a3d882cdf757407 WatchSource:0}: Error finding container 9da9545bddf199d8f0bce9b6bda67abc8f02460ef971bc730a3d882cdf757407: Status 404 returned error can't find the container with id 9da9545bddf199d8f0bce9b6bda67abc8f02460ef971bc730a3d882cdf757407 Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.883306 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp"] Dec 04 10:20:35 crc kubenswrapper[4776]: I1204 10:20:35.974029 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" event={"ID":"9836203f-04e7-4179-b4fa-8e133dbe8e5a","Type":"ContainerStarted","Data":"9da9545bddf199d8f0bce9b6bda67abc8f02460ef971bc730a3d882cdf757407"} Dec 04 10:20:37 crc kubenswrapper[4776]: I1204 10:20:37.010314 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" event={"ID":"9836203f-04e7-4179-b4fa-8e133dbe8e5a","Type":"ContainerStarted","Data":"6008e7c0725379783f8af6c203e1bf5fe36c4646b300b681dc29df64bf544095"} Dec 04 10:20:37 crc kubenswrapper[4776]: I1204 10:20:37.027226 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" podStartSLOduration=1.6458297800000001 podStartE2EDuration="2.027205814s" podCreationTimestamp="2025-12-04 10:20:35 +0000 UTC" firstStartedPulling="2025-12-04 10:20:35.88674437 +0000 UTC m=+2480.753224747" lastFinishedPulling="2025-12-04 10:20:36.268120404 +0000 UTC m=+2481.134600781" observedRunningTime="2025-12-04 10:20:37.024590222 +0000 UTC m=+2481.891070599" watchObservedRunningTime="2025-12-04 10:20:37.027205814 +0000 UTC m=+2481.893686191" Dec 04 10:20:40 crc kubenswrapper[4776]: I1204 10:20:40.453792 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:20:40 crc kubenswrapper[4776]: E1204 10:20:40.454568 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:20:42 crc kubenswrapper[4776]: I1204 10:20:42.130586 4776 generic.go:334] "Generic (PLEG): container finished" podID="9836203f-04e7-4179-b4fa-8e133dbe8e5a" containerID="6008e7c0725379783f8af6c203e1bf5fe36c4646b300b681dc29df64bf544095" exitCode=0 Dec 04 10:20:42 crc kubenswrapper[4776]: I1204 10:20:42.131290 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" event={"ID":"9836203f-04e7-4179-b4fa-8e133dbe8e5a","Type":"ContainerDied","Data":"6008e7c0725379783f8af6c203e1bf5fe36c4646b300b681dc29df64bf544095"} Dec 04 10:20:43 crc kubenswrapper[4776]: I1204 10:20:43.604727 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:43 crc kubenswrapper[4776]: I1204 10:20:43.779006 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-inventory\") pod \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " Dec 04 10:20:43 crc kubenswrapper[4776]: I1204 10:20:43.779370 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-ceph\") pod \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " Dec 04 10:20:43 crc kubenswrapper[4776]: I1204 10:20:43.779397 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-ssh-key\") pod \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " Dec 04 10:20:43 crc kubenswrapper[4776]: I1204 10:20:43.779449 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cdj9\" (UniqueName: \"kubernetes.io/projected/9836203f-04e7-4179-b4fa-8e133dbe8e5a-kube-api-access-2cdj9\") pod \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\" (UID: \"9836203f-04e7-4179-b4fa-8e133dbe8e5a\") " Dec 04 10:20:43 crc kubenswrapper[4776]: I1204 10:20:43.786984 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-ceph" (OuterVolumeSpecName: "ceph") pod "9836203f-04e7-4179-b4fa-8e133dbe8e5a" (UID: "9836203f-04e7-4179-b4fa-8e133dbe8e5a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:43 crc kubenswrapper[4776]: I1204 10:20:43.793061 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9836203f-04e7-4179-b4fa-8e133dbe8e5a-kube-api-access-2cdj9" (OuterVolumeSpecName: "kube-api-access-2cdj9") pod "9836203f-04e7-4179-b4fa-8e133dbe8e5a" (UID: "9836203f-04e7-4179-b4fa-8e133dbe8e5a"). InnerVolumeSpecName "kube-api-access-2cdj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:20:43 crc kubenswrapper[4776]: I1204 10:20:43.806655 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-inventory" (OuterVolumeSpecName: "inventory") pod "9836203f-04e7-4179-b4fa-8e133dbe8e5a" (UID: "9836203f-04e7-4179-b4fa-8e133dbe8e5a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:43 crc kubenswrapper[4776]: I1204 10:20:43.811225 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9836203f-04e7-4179-b4fa-8e133dbe8e5a" (UID: "9836203f-04e7-4179-b4fa-8e133dbe8e5a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:20:43 crc kubenswrapper[4776]: I1204 10:20:43.882131 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:43 crc kubenswrapper[4776]: I1204 10:20:43.882177 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:43 crc kubenswrapper[4776]: I1204 10:20:43.882193 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cdj9\" (UniqueName: \"kubernetes.io/projected/9836203f-04e7-4179-b4fa-8e133dbe8e5a-kube-api-access-2cdj9\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:43 crc kubenswrapper[4776]: I1204 10:20:43.882204 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9836203f-04e7-4179-b4fa-8e133dbe8e5a-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.168457 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" event={"ID":"9836203f-04e7-4179-b4fa-8e133dbe8e5a","Type":"ContainerDied","Data":"9da9545bddf199d8f0bce9b6bda67abc8f02460ef971bc730a3d882cdf757407"} Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.168522 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9da9545bddf199d8f0bce9b6bda67abc8f02460ef971bc730a3d882cdf757407" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.168535 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9grzp" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.240594 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5"] Dec 04 10:20:44 crc kubenswrapper[4776]: E1204 10:20:44.241150 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9836203f-04e7-4179-b4fa-8e133dbe8e5a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.241176 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9836203f-04e7-4179-b4fa-8e133dbe8e5a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.241429 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9836203f-04e7-4179-b4fa-8e133dbe8e5a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.242238 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.244533 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.245507 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.245664 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.245882 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.249658 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.251502 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5"] Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.390332 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zhww5\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.390407 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zhww5\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.390445 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfw2s\" (UniqueName: \"kubernetes.io/projected/4696f658-d3e7-4aee-9569-80a393613cb9-kube-api-access-pfw2s\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zhww5\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.390544 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zhww5\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.492638 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zhww5\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.492728 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zhww5\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.492776 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfw2s\" (UniqueName: 
\"kubernetes.io/projected/4696f658-d3e7-4aee-9569-80a393613cb9-kube-api-access-pfw2s\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zhww5\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.492852 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zhww5\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.496902 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zhww5\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.497249 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zhww5\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.498487 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zhww5\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.510636 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfw2s\" (UniqueName: \"kubernetes.io/projected/4696f658-d3e7-4aee-9569-80a393613cb9-kube-api-access-pfw2s\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zhww5\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:44 crc kubenswrapper[4776]: I1204 10:20:44.559597 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:20:45 crc kubenswrapper[4776]: I1204 10:20:45.133880 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5"] Dec 04 10:20:45 crc kubenswrapper[4776]: I1204 10:20:45.177010 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" event={"ID":"4696f658-d3e7-4aee-9569-80a393613cb9","Type":"ContainerStarted","Data":"1d18af2194c92a88ce183fcaecaa2f2127c6f5dd8d9b780fba70b99b1bf4b7e1"} Dec 04 10:20:47 crc kubenswrapper[4776]: I1204 10:20:47.192205 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" event={"ID":"4696f658-d3e7-4aee-9569-80a393613cb9","Type":"ContainerStarted","Data":"525dac27b7944c5ff6c34caa197780fd2a8da9bf3788c6936262be10bb4c48f2"} Dec 04 10:20:47 crc kubenswrapper[4776]: I1204 10:20:47.215095 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" podStartSLOduration=1.719792618 podStartE2EDuration="3.215078021s" podCreationTimestamp="2025-12-04 10:20:44 +0000 UTC" firstStartedPulling="2025-12-04 10:20:45.142120032 +0000 UTC m=+2490.008600409" lastFinishedPulling="2025-12-04 10:20:46.637405435 +0000 UTC m=+2491.503885812" observedRunningTime="2025-12-04 10:20:47.211655603 +0000 UTC 
m=+2492.078136000" watchObservedRunningTime="2025-12-04 10:20:47.215078021 +0000 UTC m=+2492.081558398" Dec 04 10:20:53 crc kubenswrapper[4776]: I1204 10:20:53.452165 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:20:53 crc kubenswrapper[4776]: E1204 10:20:53.452980 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:21:06 crc kubenswrapper[4776]: I1204 10:21:06.453271 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:21:06 crc kubenswrapper[4776]: E1204 10:21:06.456206 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:21:18 crc kubenswrapper[4776]: I1204 10:21:18.452479 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:21:18 crc kubenswrapper[4776]: E1204 10:21:18.453450 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:21:21 crc kubenswrapper[4776]: I1204 10:21:21.547502 4776 generic.go:334] "Generic (PLEG): container finished" podID="4696f658-d3e7-4aee-9569-80a393613cb9" containerID="525dac27b7944c5ff6c34caa197780fd2a8da9bf3788c6936262be10bb4c48f2" exitCode=0 Dec 04 10:21:21 crc kubenswrapper[4776]: I1204 10:21:21.547619 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" event={"ID":"4696f658-d3e7-4aee-9569-80a393613cb9","Type":"ContainerDied","Data":"525dac27b7944c5ff6c34caa197780fd2a8da9bf3788c6936262be10bb4c48f2"} Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.013518 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.105233 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-ssh-key\") pod \"4696f658-d3e7-4aee-9569-80a393613cb9\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.105594 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-ceph\") pod \"4696f658-d3e7-4aee-9569-80a393613cb9\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.105667 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-inventory\") pod \"4696f658-d3e7-4aee-9569-80a393613cb9\" (UID: 
\"4696f658-d3e7-4aee-9569-80a393613cb9\") " Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.105733 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfw2s\" (UniqueName: \"kubernetes.io/projected/4696f658-d3e7-4aee-9569-80a393613cb9-kube-api-access-pfw2s\") pod \"4696f658-d3e7-4aee-9569-80a393613cb9\" (UID: \"4696f658-d3e7-4aee-9569-80a393613cb9\") " Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.111640 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4696f658-d3e7-4aee-9569-80a393613cb9-kube-api-access-pfw2s" (OuterVolumeSpecName: "kube-api-access-pfw2s") pod "4696f658-d3e7-4aee-9569-80a393613cb9" (UID: "4696f658-d3e7-4aee-9569-80a393613cb9"). InnerVolumeSpecName "kube-api-access-pfw2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.113380 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-ceph" (OuterVolumeSpecName: "ceph") pod "4696f658-d3e7-4aee-9569-80a393613cb9" (UID: "4696f658-d3e7-4aee-9569-80a393613cb9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.140137 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4696f658-d3e7-4aee-9569-80a393613cb9" (UID: "4696f658-d3e7-4aee-9569-80a393613cb9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.142622 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-inventory" (OuterVolumeSpecName: "inventory") pod "4696f658-d3e7-4aee-9569-80a393613cb9" (UID: "4696f658-d3e7-4aee-9569-80a393613cb9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.208981 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.209017 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.209028 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4696f658-d3e7-4aee-9569-80a393613cb9-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.209039 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfw2s\" (UniqueName: \"kubernetes.io/projected/4696f658-d3e7-4aee-9569-80a393613cb9-kube-api-access-pfw2s\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.571937 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" event={"ID":"4696f658-d3e7-4aee-9569-80a393613cb9","Type":"ContainerDied","Data":"1d18af2194c92a88ce183fcaecaa2f2127c6f5dd8d9b780fba70b99b1bf4b7e1"} Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.571988 4776 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="1d18af2194c92a88ce183fcaecaa2f2127c6f5dd8d9b780fba70b99b1bf4b7e1" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.572053 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zhww5" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.713256 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6"] Dec 04 10:21:23 crc kubenswrapper[4776]: E1204 10:21:23.713736 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4696f658-d3e7-4aee-9569-80a393613cb9" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.713764 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="4696f658-d3e7-4aee-9569-80a393613cb9" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.714067 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="4696f658-d3e7-4aee-9569-80a393613cb9" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.714835 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.717023 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.717095 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.717494 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.717518 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.718026 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.724561 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6"] Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.820240 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.820305 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.820457 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6vz6\" (UniqueName: \"kubernetes.io/projected/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-kube-api-access-x6vz6\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.820615 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.923363 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6vz6\" (UniqueName: \"kubernetes.io/projected/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-kube-api-access-x6vz6\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.923629 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.923807 4776 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.923907 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.927536 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.930241 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:23 crc kubenswrapper[4776]: I1204 10:21:23.939851 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:23 crc kubenswrapper[4776]: 
I1204 10:21:23.950013 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6vz6\" (UniqueName: \"kubernetes.io/projected/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-kube-api-access-x6vz6\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:24 crc kubenswrapper[4776]: I1204 10:21:24.048279 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:24 crc kubenswrapper[4776]: I1204 10:21:24.546814 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6"] Dec 04 10:21:24 crc kubenswrapper[4776]: I1204 10:21:24.550294 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:21:24 crc kubenswrapper[4776]: I1204 10:21:24.582040 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" event={"ID":"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660","Type":"ContainerStarted","Data":"559b83466bf4bd0b5c13a2309c01e4e86b322e6cc4a96d2af5b3d3a4e0148109"} Dec 04 10:21:25 crc kubenswrapper[4776]: I1204 10:21:25.621232 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" event={"ID":"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660","Type":"ContainerStarted","Data":"0b9fb266dfa231f8b56dbd5adbe27840bd815105160aa75a900a779c56677dc3"} Dec 04 10:21:25 crc kubenswrapper[4776]: I1204 10:21:25.654314 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" podStartSLOduration=2.2701694359999998 podStartE2EDuration="2.654290305s" podCreationTimestamp="2025-12-04 10:21:23 +0000 UTC" 
firstStartedPulling="2025-12-04 10:21:24.55008679 +0000 UTC m=+2529.416567167" lastFinishedPulling="2025-12-04 10:21:24.934207659 +0000 UTC m=+2529.800688036" observedRunningTime="2025-12-04 10:21:25.647259674 +0000 UTC m=+2530.513740051" watchObservedRunningTime="2025-12-04 10:21:25.654290305 +0000 UTC m=+2530.520770682" Dec 04 10:21:29 crc kubenswrapper[4776]: I1204 10:21:29.656655 4776 generic.go:334] "Generic (PLEG): container finished" podID="866bd984-5d2f-4eb5-ad8e-e05f3e2d1660" containerID="0b9fb266dfa231f8b56dbd5adbe27840bd815105160aa75a900a779c56677dc3" exitCode=0 Dec 04 10:21:29 crc kubenswrapper[4776]: I1204 10:21:29.656885 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" event={"ID":"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660","Type":"ContainerDied","Data":"0b9fb266dfa231f8b56dbd5adbe27840bd815105160aa75a900a779c56677dc3"} Dec 04 10:21:30 crc kubenswrapper[4776]: I1204 10:21:30.453087 4776 scope.go:117] "RemoveContainer" containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.123106 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.306857 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-ceph\") pod \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.307064 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-ssh-key\") pod \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.307221 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6vz6\" (UniqueName: \"kubernetes.io/projected/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-kube-api-access-x6vz6\") pod \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.307330 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-inventory\") pod \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\" (UID: \"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660\") " Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.325806 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-ceph" (OuterVolumeSpecName: "ceph") pod "866bd984-5d2f-4eb5-ad8e-e05f3e2d1660" (UID: "866bd984-5d2f-4eb5-ad8e-e05f3e2d1660"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.332444 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-kube-api-access-x6vz6" (OuterVolumeSpecName: "kube-api-access-x6vz6") pod "866bd984-5d2f-4eb5-ad8e-e05f3e2d1660" (UID: "866bd984-5d2f-4eb5-ad8e-e05f3e2d1660"). InnerVolumeSpecName "kube-api-access-x6vz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.334045 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "866bd984-5d2f-4eb5-ad8e-e05f3e2d1660" (UID: "866bd984-5d2f-4eb5-ad8e-e05f3e2d1660"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.339137 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-inventory" (OuterVolumeSpecName: "inventory") pod "866bd984-5d2f-4eb5-ad8e-e05f3e2d1660" (UID: "866bd984-5d2f-4eb5-ad8e-e05f3e2d1660"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.411254 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.411285 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.411300 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6vz6\" (UniqueName: \"kubernetes.io/projected/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-kube-api-access-x6vz6\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.411314 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/866bd984-5d2f-4eb5-ad8e-e05f3e2d1660-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.675839 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"3b3694b7bd99eb3bb73428b2f0d0a80b952a000625fa6e6dc456d1e286636470"} Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.677907 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" event={"ID":"866bd984-5d2f-4eb5-ad8e-e05f3e2d1660","Type":"ContainerDied","Data":"559b83466bf4bd0b5c13a2309c01e4e86b322e6cc4a96d2af5b3d3a4e0148109"} Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.677954 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="559b83466bf4bd0b5c13a2309c01e4e86b322e6cc4a96d2af5b3d3a4e0148109" Dec 04 10:21:31 crc 
kubenswrapper[4776]: I1204 10:21:31.677996 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.762904 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q"] Dec 04 10:21:31 crc kubenswrapper[4776]: E1204 10:21:31.764306 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="866bd984-5d2f-4eb5-ad8e-e05f3e2d1660" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.764336 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="866bd984-5d2f-4eb5-ad8e-e05f3e2d1660" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.769995 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="866bd984-5d2f-4eb5-ad8e-e05f3e2d1660" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.770788 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.774218 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.774473 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.774668 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.774792 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.774868 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.781688 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q"] Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.919931 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76bhw\" (UniqueName: \"kubernetes.io/projected/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-kube-api-access-76bhw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.919994 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q\" (UID: 
\"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.920140 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:21:31 crc kubenswrapper[4776]: I1204 10:21:31.920454 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:21:32 crc kubenswrapper[4776]: I1204 10:21:32.024631 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:21:32 crc kubenswrapper[4776]: I1204 10:21:32.024776 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76bhw\" (UniqueName: \"kubernetes.io/projected/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-kube-api-access-76bhw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:21:32 crc kubenswrapper[4776]: I1204 10:21:32.024813 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:21:32 crc kubenswrapper[4776]: I1204 10:21:32.024892 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:21:32 crc kubenswrapper[4776]: I1204 10:21:32.031277 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:21:32 crc kubenswrapper[4776]: I1204 10:21:32.042475 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:21:32 crc kubenswrapper[4776]: I1204 10:21:32.045429 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 
04 10:21:32 crc kubenswrapper[4776]: I1204 10:21:32.060440 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76bhw\" (UniqueName: \"kubernetes.io/projected/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-kube-api-access-76bhw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:21:32 crc kubenswrapper[4776]: I1204 10:21:32.097559 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:21:32 crc kubenswrapper[4776]: I1204 10:21:32.582399 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q"] Dec 04 10:21:32 crc kubenswrapper[4776]: W1204 10:21:32.585347 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfac53d89_7903_4d3b_abea_efbbe8f6a1b3.slice/crio-15b358ed075fb38727f58a684be10cb98956ecefebf10517eef914e137e29789 WatchSource:0}: Error finding container 15b358ed075fb38727f58a684be10cb98956ecefebf10517eef914e137e29789: Status 404 returned error can't find the container with id 15b358ed075fb38727f58a684be10cb98956ecefebf10517eef914e137e29789 Dec 04 10:21:32 crc kubenswrapper[4776]: I1204 10:21:32.686280 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" event={"ID":"fac53d89-7903-4d3b-abea-efbbe8f6a1b3","Type":"ContainerStarted","Data":"15b358ed075fb38727f58a684be10cb98956ecefebf10517eef914e137e29789"} Dec 04 10:21:33 crc kubenswrapper[4776]: I1204 10:21:33.695820 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" 
event={"ID":"fac53d89-7903-4d3b-abea-efbbe8f6a1b3","Type":"ContainerStarted","Data":"30dff3015c69dc6316eb7243ee42725db81c21547739e4bb58cd8d1f2e4804f7"} Dec 04 10:21:33 crc kubenswrapper[4776]: I1204 10:21:33.716723 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" podStartSLOduration=1.932628449 podStartE2EDuration="2.716692634s" podCreationTimestamp="2025-12-04 10:21:31 +0000 UTC" firstStartedPulling="2025-12-04 10:21:32.588120675 +0000 UTC m=+2537.454601052" lastFinishedPulling="2025-12-04 10:21:33.37218486 +0000 UTC m=+2538.238665237" observedRunningTime="2025-12-04 10:21:33.714475595 +0000 UTC m=+2538.580956002" watchObservedRunningTime="2025-12-04 10:21:33.716692634 +0000 UTC m=+2538.583173011" Dec 04 10:22:16 crc kubenswrapper[4776]: I1204 10:22:16.120281 4776 generic.go:334] "Generic (PLEG): container finished" podID="fac53d89-7903-4d3b-abea-efbbe8f6a1b3" containerID="30dff3015c69dc6316eb7243ee42725db81c21547739e4bb58cd8d1f2e4804f7" exitCode=0 Dec 04 10:22:16 crc kubenswrapper[4776]: I1204 10:22:16.120360 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" event={"ID":"fac53d89-7903-4d3b-abea-efbbe8f6a1b3","Type":"ContainerDied","Data":"30dff3015c69dc6316eb7243ee42725db81c21547739e4bb58cd8d1f2e4804f7"} Dec 04 10:22:17 crc kubenswrapper[4776]: I1204 10:22:17.531045 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:22:17 crc kubenswrapper[4776]: I1204 10:22:17.622036 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-inventory\") pod \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " Dec 04 10:22:17 crc kubenswrapper[4776]: I1204 10:22:17.622378 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-ceph\") pod \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " Dec 04 10:22:17 crc kubenswrapper[4776]: I1204 10:22:17.622484 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76bhw\" (UniqueName: \"kubernetes.io/projected/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-kube-api-access-76bhw\") pod \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " Dec 04 10:22:17 crc kubenswrapper[4776]: I1204 10:22:17.622624 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-ssh-key\") pod \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\" (UID: \"fac53d89-7903-4d3b-abea-efbbe8f6a1b3\") " Dec 04 10:22:17 crc kubenswrapper[4776]: I1204 10:22:17.627954 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-kube-api-access-76bhw" (OuterVolumeSpecName: "kube-api-access-76bhw") pod "fac53d89-7903-4d3b-abea-efbbe8f6a1b3" (UID: "fac53d89-7903-4d3b-abea-efbbe8f6a1b3"). InnerVolumeSpecName "kube-api-access-76bhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:22:17 crc kubenswrapper[4776]: I1204 10:22:17.628213 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-ceph" (OuterVolumeSpecName: "ceph") pod "fac53d89-7903-4d3b-abea-efbbe8f6a1b3" (UID: "fac53d89-7903-4d3b-abea-efbbe8f6a1b3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:17 crc kubenswrapper[4776]: I1204 10:22:17.650709 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fac53d89-7903-4d3b-abea-efbbe8f6a1b3" (UID: "fac53d89-7903-4d3b-abea-efbbe8f6a1b3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:17 crc kubenswrapper[4776]: I1204 10:22:17.652619 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-inventory" (OuterVolumeSpecName: "inventory") pod "fac53d89-7903-4d3b-abea-efbbe8f6a1b3" (UID: "fac53d89-7903-4d3b-abea-efbbe8f6a1b3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:17 crc kubenswrapper[4776]: I1204 10:22:17.724653 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:17 crc kubenswrapper[4776]: I1204 10:22:17.724688 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:17 crc kubenswrapper[4776]: I1204 10:22:17.724698 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:17 crc kubenswrapper[4776]: I1204 10:22:17.724707 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76bhw\" (UniqueName: \"kubernetes.io/projected/fac53d89-7903-4d3b-abea-efbbe8f6a1b3-kube-api-access-76bhw\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.137796 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" event={"ID":"fac53d89-7903-4d3b-abea-efbbe8f6a1b3","Type":"ContainerDied","Data":"15b358ed075fb38727f58a684be10cb98956ecefebf10517eef914e137e29789"} Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.137841 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15b358ed075fb38727f58a684be10cb98956ecefebf10517eef914e137e29789" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.137848 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.240161 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bk6g6"] Dec 04 10:22:18 crc kubenswrapper[4776]: E1204 10:22:18.240580 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac53d89-7903-4d3b-abea-efbbe8f6a1b3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.240602 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac53d89-7903-4d3b-abea-efbbe8f6a1b3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.240808 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac53d89-7903-4d3b-abea-efbbe8f6a1b3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.241471 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.247658 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.247856 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.247988 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.248094 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.251810 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.262880 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bk6g6"] Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.335890 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq999\" (UniqueName: \"kubernetes.io/projected/5a1d46aa-2142-479a-9f26-2e8d24b69dca-kube-api-access-qq999\") pod \"ssh-known-hosts-edpm-deployment-bk6g6\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.336005 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-ceph\") pod \"ssh-known-hosts-edpm-deployment-bk6g6\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:18 crc 
kubenswrapper[4776]: I1204 10:22:18.336050 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-bk6g6\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.336207 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-bk6g6\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.437480 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-bk6g6\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.437867 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-bk6g6\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.437930 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq999\" (UniqueName: \"kubernetes.io/projected/5a1d46aa-2142-479a-9f26-2e8d24b69dca-kube-api-access-qq999\") pod \"ssh-known-hosts-edpm-deployment-bk6g6\" (UID: 
\"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.437975 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-ceph\") pod \"ssh-known-hosts-edpm-deployment-bk6g6\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.442980 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-bk6g6\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.443468 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-bk6g6\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.443544 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-ceph\") pod \"ssh-known-hosts-edpm-deployment-bk6g6\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.456818 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq999\" (UniqueName: \"kubernetes.io/projected/5a1d46aa-2142-479a-9f26-2e8d24b69dca-kube-api-access-qq999\") pod \"ssh-known-hosts-edpm-deployment-bk6g6\" (UID: 
\"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:18 crc kubenswrapper[4776]: I1204 10:22:18.559839 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:19 crc kubenswrapper[4776]: I1204 10:22:19.169380 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bk6g6"] Dec 04 10:22:20 crc kubenswrapper[4776]: I1204 10:22:20.156510 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" event={"ID":"5a1d46aa-2142-479a-9f26-2e8d24b69dca","Type":"ContainerStarted","Data":"3e0c089c4c7973bbb912c06a6b2beb1168ce0a09e37504a49c71da943d5bcb56"} Dec 04 10:22:20 crc kubenswrapper[4776]: I1204 10:22:20.156838 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" event={"ID":"5a1d46aa-2142-479a-9f26-2e8d24b69dca","Type":"ContainerStarted","Data":"8939538ce8286baf2eb1c000a531fbebadd5f1b3f2ff80c3f54ef1bb0be638c5"} Dec 04 10:22:20 crc kubenswrapper[4776]: I1204 10:22:20.175382 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" podStartSLOduration=1.739922622 podStartE2EDuration="2.175367383s" podCreationTimestamp="2025-12-04 10:22:18 +0000 UTC" firstStartedPulling="2025-12-04 10:22:19.182133491 +0000 UTC m=+2584.048613868" lastFinishedPulling="2025-12-04 10:22:19.617578252 +0000 UTC m=+2584.484058629" observedRunningTime="2025-12-04 10:22:20.175112565 +0000 UTC m=+2585.041592942" watchObservedRunningTime="2025-12-04 10:22:20.175367383 +0000 UTC m=+2585.041847760" Dec 04 10:22:29 crc kubenswrapper[4776]: I1204 10:22:29.237716 4776 generic.go:334] "Generic (PLEG): container finished" podID="5a1d46aa-2142-479a-9f26-2e8d24b69dca" containerID="3e0c089c4c7973bbb912c06a6b2beb1168ce0a09e37504a49c71da943d5bcb56" 
exitCode=0 Dec 04 10:22:29 crc kubenswrapper[4776]: I1204 10:22:29.237783 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" event={"ID":"5a1d46aa-2142-479a-9f26-2e8d24b69dca","Type":"ContainerDied","Data":"3e0c089c4c7973bbb912c06a6b2beb1168ce0a09e37504a49c71da943d5bcb56"} Dec 04 10:22:30 crc kubenswrapper[4776]: I1204 10:22:30.661731 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:30 crc kubenswrapper[4776]: I1204 10:22:30.781004 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-ssh-key-openstack-edpm-ipam\") pod \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " Dec 04 10:22:30 crc kubenswrapper[4776]: I1204 10:22:30.781079 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-ceph\") pod \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " Dec 04 10:22:30 crc kubenswrapper[4776]: I1204 10:22:30.781232 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq999\" (UniqueName: \"kubernetes.io/projected/5a1d46aa-2142-479a-9f26-2e8d24b69dca-kube-api-access-qq999\") pod \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " Dec 04 10:22:30 crc kubenswrapper[4776]: I1204 10:22:30.781295 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-inventory-0\") pod \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\" (UID: \"5a1d46aa-2142-479a-9f26-2e8d24b69dca\") " Dec 04 10:22:30 crc kubenswrapper[4776]: I1204 
10:22:30.786446 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-ceph" (OuterVolumeSpecName: "ceph") pod "5a1d46aa-2142-479a-9f26-2e8d24b69dca" (UID: "5a1d46aa-2142-479a-9f26-2e8d24b69dca"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:30 crc kubenswrapper[4776]: I1204 10:22:30.787265 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1d46aa-2142-479a-9f26-2e8d24b69dca-kube-api-access-qq999" (OuterVolumeSpecName: "kube-api-access-qq999") pod "5a1d46aa-2142-479a-9f26-2e8d24b69dca" (UID: "5a1d46aa-2142-479a-9f26-2e8d24b69dca"). InnerVolumeSpecName "kube-api-access-qq999". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:22:30 crc kubenswrapper[4776]: I1204 10:22:30.808002 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "5a1d46aa-2142-479a-9f26-2e8d24b69dca" (UID: "5a1d46aa-2142-479a-9f26-2e8d24b69dca"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:30 crc kubenswrapper[4776]: I1204 10:22:30.812120 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5a1d46aa-2142-479a-9f26-2e8d24b69dca" (UID: "5a1d46aa-2142-479a-9f26-2e8d24b69dca"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:30 crc kubenswrapper[4776]: I1204 10:22:30.883115 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:30 crc kubenswrapper[4776]: I1204 10:22:30.883154 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq999\" (UniqueName: \"kubernetes.io/projected/5a1d46aa-2142-479a-9f26-2e8d24b69dca-kube-api-access-qq999\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:30 crc kubenswrapper[4776]: I1204 10:22:30.883168 4776 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:30 crc kubenswrapper[4776]: I1204 10:22:30.883178 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a1d46aa-2142-479a-9f26-2e8d24b69dca-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.257317 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" event={"ID":"5a1d46aa-2142-479a-9f26-2e8d24b69dca","Type":"ContainerDied","Data":"8939538ce8286baf2eb1c000a531fbebadd5f1b3f2ff80c3f54ef1bb0be638c5"} Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.257359 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8939538ce8286baf2eb1c000a531fbebadd5f1b3f2ff80c3f54ef1bb0be638c5" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.257400 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bk6g6" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.325679 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x"] Dec 04 10:22:31 crc kubenswrapper[4776]: E1204 10:22:31.326335 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1d46aa-2142-479a-9f26-2e8d24b69dca" containerName="ssh-known-hosts-edpm-deployment" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.326352 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1d46aa-2142-479a-9f26-2e8d24b69dca" containerName="ssh-known-hosts-edpm-deployment" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.326551 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1d46aa-2142-479a-9f26-2e8d24b69dca" containerName="ssh-known-hosts-edpm-deployment" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.327264 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.329732 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.329772 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.329867 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.330339 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.336688 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x"] Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.338476 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.493753 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zngtk\" (UniqueName: \"kubernetes.io/projected/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-kube-api-access-zngtk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z8x2x\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.493835 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z8x2x\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.493860 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z8x2x\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.494080 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z8x2x\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.596365 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z8x2x\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.596430 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z8x2x\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.596580 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-ceph\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-z8x2x\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.596629 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zngtk\" (UniqueName: \"kubernetes.io/projected/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-kube-api-access-zngtk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z8x2x\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.601651 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z8x2x\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.601954 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z8x2x\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.602566 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z8x2x\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.624758 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zngtk\" (UniqueName: 
\"kubernetes.io/projected/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-kube-api-access-zngtk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z8x2x\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:31 crc kubenswrapper[4776]: I1204 10:22:31.645429 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:32 crc kubenswrapper[4776]: I1204 10:22:32.141206 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x"] Dec 04 10:22:32 crc kubenswrapper[4776]: I1204 10:22:32.266446 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" event={"ID":"6b269d5b-a372-4bb4-8e6c-558e97ce60cf","Type":"ContainerStarted","Data":"0d77cf9d14ca5f5ddecab40ca5b16ee1206a4955fbf9d67896a9b6d22ce145f4"} Dec 04 10:22:33 crc kubenswrapper[4776]: I1204 10:22:33.274358 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" event={"ID":"6b269d5b-a372-4bb4-8e6c-558e97ce60cf","Type":"ContainerStarted","Data":"e88abc9220dc1fb9662b67c494b4723524310c3a5f8bebe843c2d7041b94cb51"} Dec 04 10:22:33 crc kubenswrapper[4776]: I1204 10:22:33.294120 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" podStartSLOduration=1.823454887 podStartE2EDuration="2.294103862s" podCreationTimestamp="2025-12-04 10:22:31 +0000 UTC" firstStartedPulling="2025-12-04 10:22:32.145327607 +0000 UTC m=+2597.011807984" lastFinishedPulling="2025-12-04 10:22:32.615976582 +0000 UTC m=+2597.482456959" observedRunningTime="2025-12-04 10:22:33.289235988 +0000 UTC m=+2598.155716385" watchObservedRunningTime="2025-12-04 10:22:33.294103862 +0000 UTC m=+2598.160584239" Dec 04 10:22:40 crc 
kubenswrapper[4776]: I1204 10:22:40.332770 4776 generic.go:334] "Generic (PLEG): container finished" podID="6b269d5b-a372-4bb4-8e6c-558e97ce60cf" containerID="e88abc9220dc1fb9662b67c494b4723524310c3a5f8bebe843c2d7041b94cb51" exitCode=0 Dec 04 10:22:40 crc kubenswrapper[4776]: I1204 10:22:40.332870 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" event={"ID":"6b269d5b-a372-4bb4-8e6c-558e97ce60cf","Type":"ContainerDied","Data":"e88abc9220dc1fb9662b67c494b4723524310c3a5f8bebe843c2d7041b94cb51"} Dec 04 10:22:41 crc kubenswrapper[4776]: I1204 10:22:41.714139 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:41 crc kubenswrapper[4776]: I1204 10:22:41.783703 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-ceph\") pod \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " Dec 04 10:22:41 crc kubenswrapper[4776]: I1204 10:22:41.783788 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-inventory\") pod \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " Dec 04 10:22:41 crc kubenswrapper[4776]: I1204 10:22:41.783851 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zngtk\" (UniqueName: \"kubernetes.io/projected/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-kube-api-access-zngtk\") pod \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " Dec 04 10:22:41 crc kubenswrapper[4776]: I1204 10:22:41.784007 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-ssh-key\") pod \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\" (UID: \"6b269d5b-a372-4bb4-8e6c-558e97ce60cf\") " Dec 04 10:22:41 crc kubenswrapper[4776]: I1204 10:22:41.790793 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-kube-api-access-zngtk" (OuterVolumeSpecName: "kube-api-access-zngtk") pod "6b269d5b-a372-4bb4-8e6c-558e97ce60cf" (UID: "6b269d5b-a372-4bb4-8e6c-558e97ce60cf"). InnerVolumeSpecName "kube-api-access-zngtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:22:41 crc kubenswrapper[4776]: I1204 10:22:41.792041 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-ceph" (OuterVolumeSpecName: "ceph") pod "6b269d5b-a372-4bb4-8e6c-558e97ce60cf" (UID: "6b269d5b-a372-4bb4-8e6c-558e97ce60cf"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:41 crc kubenswrapper[4776]: I1204 10:22:41.808783 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-inventory" (OuterVolumeSpecName: "inventory") pod "6b269d5b-a372-4bb4-8e6c-558e97ce60cf" (UID: "6b269d5b-a372-4bb4-8e6c-558e97ce60cf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:41 crc kubenswrapper[4776]: I1204 10:22:41.808852 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6b269d5b-a372-4bb4-8e6c-558e97ce60cf" (UID: "6b269d5b-a372-4bb4-8e6c-558e97ce60cf"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:41 crc kubenswrapper[4776]: I1204 10:22:41.886197 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:41 crc kubenswrapper[4776]: I1204 10:22:41.886255 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:41 crc kubenswrapper[4776]: I1204 10:22:41.886272 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zngtk\" (UniqueName: \"kubernetes.io/projected/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-kube-api-access-zngtk\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:41 crc kubenswrapper[4776]: I1204 10:22:41.886284 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b269d5b-a372-4bb4-8e6c-558e97ce60cf-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.351860 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" event={"ID":"6b269d5b-a372-4bb4-8e6c-558e97ce60cf","Type":"ContainerDied","Data":"0d77cf9d14ca5f5ddecab40ca5b16ee1206a4955fbf9d67896a9b6d22ce145f4"} Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.352127 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d77cf9d14ca5f5ddecab40ca5b16ee1206a4955fbf9d67896a9b6d22ce145f4" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.351966 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z8x2x" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.442478 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl"] Dec 04 10:22:42 crc kubenswrapper[4776]: E1204 10:22:42.443030 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b269d5b-a372-4bb4-8e6c-558e97ce60cf" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.443059 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b269d5b-a372-4bb4-8e6c-558e97ce60cf" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.443278 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b269d5b-a372-4bb4-8e6c-558e97ce60cf" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.444113 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.449457 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.449645 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.449925 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.450123 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.450144 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl"] Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.453320 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.496905 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.497012 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.497036 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.497185 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pjw9\" (UniqueName: \"kubernetes.io/projected/9166f367-b1aa-46ad-945d-d1653c18a914-kube-api-access-2pjw9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.599650 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.599815 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.599856 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.599970 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pjw9\" (UniqueName: \"kubernetes.io/projected/9166f367-b1aa-46ad-945d-d1653c18a914-kube-api-access-2pjw9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.609320 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.609446 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.615057 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.615606 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pjw9\" (UniqueName: \"kubernetes.io/projected/9166f367-b1aa-46ad-945d-d1653c18a914-kube-api-access-2pjw9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:42 crc kubenswrapper[4776]: I1204 10:22:42.768578 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:43 crc kubenswrapper[4776]: I1204 10:22:43.264316 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl"] Dec 04 10:22:43 crc kubenswrapper[4776]: I1204 10:22:43.359899 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" event={"ID":"9166f367-b1aa-46ad-945d-d1653c18a914","Type":"ContainerStarted","Data":"102688a107585601972708fcb5e7d6741df5123a6081a5269cab22c8aeaebd74"} Dec 04 10:22:44 crc kubenswrapper[4776]: I1204 10:22:44.370493 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" event={"ID":"9166f367-b1aa-46ad-945d-d1653c18a914","Type":"ContainerStarted","Data":"f96a0c88955aa066c6882db4502a3cdc108c4f09d2a874aa920acfd59f55d8fc"} Dec 04 10:22:44 crc kubenswrapper[4776]: I1204 10:22:44.397763 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" podStartSLOduration=1.9712824690000001 podStartE2EDuration="2.397743007s" podCreationTimestamp="2025-12-04 10:22:42 +0000 UTC" firstStartedPulling="2025-12-04 10:22:43.271735428 +0000 UTC m=+2608.138215805" lastFinishedPulling="2025-12-04 10:22:43.698195956 +0000 UTC m=+2608.564676343" observedRunningTime="2025-12-04 10:22:44.387884208 +0000 UTC 
m=+2609.254364585" watchObservedRunningTime="2025-12-04 10:22:44.397743007 +0000 UTC m=+2609.264223384" Dec 04 10:22:53 crc kubenswrapper[4776]: I1204 10:22:53.444825 4776 generic.go:334] "Generic (PLEG): container finished" podID="9166f367-b1aa-46ad-945d-d1653c18a914" containerID="f96a0c88955aa066c6882db4502a3cdc108c4f09d2a874aa920acfd59f55d8fc" exitCode=0 Dec 04 10:22:53 crc kubenswrapper[4776]: I1204 10:22:53.444931 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" event={"ID":"9166f367-b1aa-46ad-945d-d1653c18a914","Type":"ContainerDied","Data":"f96a0c88955aa066c6882db4502a3cdc108c4f09d2a874aa920acfd59f55d8fc"} Dec 04 10:22:54 crc kubenswrapper[4776]: I1204 10:22:54.852225 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:54 crc kubenswrapper[4776]: I1204 10:22:54.915376 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-ssh-key\") pod \"9166f367-b1aa-46ad-945d-d1653c18a914\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " Dec 04 10:22:54 crc kubenswrapper[4776]: I1204 10:22:54.915457 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pjw9\" (UniqueName: \"kubernetes.io/projected/9166f367-b1aa-46ad-945d-d1653c18a914-kube-api-access-2pjw9\") pod \"9166f367-b1aa-46ad-945d-d1653c18a914\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " Dec 04 10:22:54 crc kubenswrapper[4776]: I1204 10:22:54.915484 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-inventory\") pod \"9166f367-b1aa-46ad-945d-d1653c18a914\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " Dec 04 10:22:54 crc kubenswrapper[4776]: 
I1204 10:22:54.915528 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-ceph\") pod \"9166f367-b1aa-46ad-945d-d1653c18a914\" (UID: \"9166f367-b1aa-46ad-945d-d1653c18a914\") " Dec 04 10:22:54 crc kubenswrapper[4776]: I1204 10:22:54.920413 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9166f367-b1aa-46ad-945d-d1653c18a914-kube-api-access-2pjw9" (OuterVolumeSpecName: "kube-api-access-2pjw9") pod "9166f367-b1aa-46ad-945d-d1653c18a914" (UID: "9166f367-b1aa-46ad-945d-d1653c18a914"). InnerVolumeSpecName "kube-api-access-2pjw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:22:54 crc kubenswrapper[4776]: I1204 10:22:54.928152 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-ceph" (OuterVolumeSpecName: "ceph") pod "9166f367-b1aa-46ad-945d-d1653c18a914" (UID: "9166f367-b1aa-46ad-945d-d1653c18a914"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:54 crc kubenswrapper[4776]: I1204 10:22:54.940272 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-inventory" (OuterVolumeSpecName: "inventory") pod "9166f367-b1aa-46ad-945d-d1653c18a914" (UID: "9166f367-b1aa-46ad-945d-d1653c18a914"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:54 crc kubenswrapper[4776]: I1204 10:22:54.945553 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9166f367-b1aa-46ad-945d-d1653c18a914" (UID: "9166f367-b1aa-46ad-945d-d1653c18a914"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.017306 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pjw9\" (UniqueName: \"kubernetes.io/projected/9166f367-b1aa-46ad-945d-d1653c18a914-kube-api-access-2pjw9\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.017353 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.017364 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.017375 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9166f367-b1aa-46ad-945d-d1653c18a914-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.465233 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.467321 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl" event={"ID":"9166f367-b1aa-46ad-945d-d1653c18a914","Type":"ContainerDied","Data":"102688a107585601972708fcb5e7d6741df5123a6081a5269cab22c8aeaebd74"} Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.467378 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="102688a107585601972708fcb5e7d6741df5123a6081a5269cab22c8aeaebd74" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.554732 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw"] Dec 04 10:22:55 crc kubenswrapper[4776]: E1204 10:22:55.555160 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9166f367-b1aa-46ad-945d-d1653c18a914" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.555182 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="9166f367-b1aa-46ad-945d-d1653c18a914" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.555352 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="9166f367-b1aa-46ad-945d-d1653c18a914" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.555897 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.558512 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.558610 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.559464 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.561277 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.561474 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.561785 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.562056 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.569950 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.577169 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw"] Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.729190 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.729250 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.729481 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.729525 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.729546 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.729660 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.729789 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.729824 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.729870 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.729890 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.729910 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngxnk\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-kube-api-access-ngxnk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.729942 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.730035 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.831408 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.831464 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.831486 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.831515 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") 
" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.831536 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.831554 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.831596 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.831634 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.831657 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.831697 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.831715 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.831733 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngxnk\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-kube-api-access-ngxnk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.831756 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.839236 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.839688 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.839858 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.839928 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ssh-key\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.839896 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.840002 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.840126 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.840345 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc 
kubenswrapper[4776]: I1204 10:22:55.840885 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.841396 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.842015 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.846902 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.855968 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngxnk\" (UniqueName: 
\"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-kube-api-access-ngxnk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:55 crc kubenswrapper[4776]: I1204 10:22:55.876463 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:22:56 crc kubenswrapper[4776]: I1204 10:22:56.450718 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw"] Dec 04 10:22:56 crc kubenswrapper[4776]: I1204 10:22:56.473035 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" event={"ID":"c2066384-4861-4b8b-8a26-ccdafaa3394d","Type":"ContainerStarted","Data":"e3db35aef6ed9ea2a83d5086f75ae9e983494eeb9015a283530383bef2eb7692"} Dec 04 10:22:57 crc kubenswrapper[4776]: I1204 10:22:57.483486 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" event={"ID":"c2066384-4861-4b8b-8a26-ccdafaa3394d","Type":"ContainerStarted","Data":"3e658304dc32120a886633b378d9658f910d00d47d36870d8ca7d0ea3f11fa66"} Dec 04 10:22:57 crc kubenswrapper[4776]: I1204 10:22:57.511797 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" podStartSLOduration=2.029902492 podStartE2EDuration="2.511770919s" podCreationTimestamp="2025-12-04 10:22:55 +0000 UTC" firstStartedPulling="2025-12-04 10:22:56.446526237 +0000 UTC m=+2621.313006614" lastFinishedPulling="2025-12-04 10:22:56.928394664 +0000 UTC m=+2621.794875041" observedRunningTime="2025-12-04 10:22:57.50446463 +0000 UTC m=+2622.370945027" watchObservedRunningTime="2025-12-04 10:22:57.511770919 +0000 UTC 
m=+2622.378251296" Dec 04 10:23:28 crc kubenswrapper[4776]: I1204 10:23:28.745468 4776 generic.go:334] "Generic (PLEG): container finished" podID="c2066384-4861-4b8b-8a26-ccdafaa3394d" containerID="3e658304dc32120a886633b378d9658f910d00d47d36870d8ca7d0ea3f11fa66" exitCode=0 Dec 04 10:23:28 crc kubenswrapper[4776]: I1204 10:23:28.745595 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" event={"ID":"c2066384-4861-4b8b-8a26-ccdafaa3394d","Type":"ContainerDied","Data":"3e658304dc32120a886633b378d9658f910d00d47d36870d8ca7d0ea3f11fa66"} Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.157715 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.277553 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngxnk\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-kube-api-access-ngxnk\") pod \"c2066384-4861-4b8b-8a26-ccdafaa3394d\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.277609 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ssh-key\") pod \"c2066384-4861-4b8b-8a26-ccdafaa3394d\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.277705 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"c2066384-4861-4b8b-8a26-ccdafaa3394d\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 
10:23:30.277745 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ovn-combined-ca-bundle\") pod \"c2066384-4861-4b8b-8a26-ccdafaa3394d\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.277762 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-repo-setup-combined-ca-bundle\") pod \"c2066384-4861-4b8b-8a26-ccdafaa3394d\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.277797 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-nova-combined-ca-bundle\") pod \"c2066384-4861-4b8b-8a26-ccdafaa3394d\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.277816 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"c2066384-4861-4b8b-8a26-ccdafaa3394d\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.277859 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-neutron-metadata-combined-ca-bundle\") pod \"c2066384-4861-4b8b-8a26-ccdafaa3394d\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.277895 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ceph\") pod \"c2066384-4861-4b8b-8a26-ccdafaa3394d\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.277940 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-inventory\") pod \"c2066384-4861-4b8b-8a26-ccdafaa3394d\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.278002 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"c2066384-4861-4b8b-8a26-ccdafaa3394d\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.278054 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-bootstrap-combined-ca-bundle\") pod \"c2066384-4861-4b8b-8a26-ccdafaa3394d\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.278076 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-libvirt-combined-ca-bundle\") pod \"c2066384-4861-4b8b-8a26-ccdafaa3394d\" (UID: \"c2066384-4861-4b8b-8a26-ccdafaa3394d\") " Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.284532 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-neutron-metadata-combined-ca-bundle" 
(OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c2066384-4861-4b8b-8a26-ccdafaa3394d" (UID: "c2066384-4861-4b8b-8a26-ccdafaa3394d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.284583 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c2066384-4861-4b8b-8a26-ccdafaa3394d" (UID: "c2066384-4861-4b8b-8a26-ccdafaa3394d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.284788 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c2066384-4861-4b8b-8a26-ccdafaa3394d" (UID: "c2066384-4861-4b8b-8a26-ccdafaa3394d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.285504 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "c2066384-4861-4b8b-8a26-ccdafaa3394d" (UID: "c2066384-4861-4b8b-8a26-ccdafaa3394d"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.285682 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c2066384-4861-4b8b-8a26-ccdafaa3394d" (UID: "c2066384-4861-4b8b-8a26-ccdafaa3394d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.285718 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c2066384-4861-4b8b-8a26-ccdafaa3394d" (UID: "c2066384-4861-4b8b-8a26-ccdafaa3394d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.286472 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-kube-api-access-ngxnk" (OuterVolumeSpecName: "kube-api-access-ngxnk") pod "c2066384-4861-4b8b-8a26-ccdafaa3394d" (UID: "c2066384-4861-4b8b-8a26-ccdafaa3394d"). InnerVolumeSpecName "kube-api-access-ngxnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.289073 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "c2066384-4861-4b8b-8a26-ccdafaa3394d" (UID: "c2066384-4861-4b8b-8a26-ccdafaa3394d"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.293057 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ceph" (OuterVolumeSpecName: "ceph") pod "c2066384-4861-4b8b-8a26-ccdafaa3394d" (UID: "c2066384-4861-4b8b-8a26-ccdafaa3394d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.297467 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "c2066384-4861-4b8b-8a26-ccdafaa3394d" (UID: "c2066384-4861-4b8b-8a26-ccdafaa3394d"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.298047 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c2066384-4861-4b8b-8a26-ccdafaa3394d" (UID: "c2066384-4861-4b8b-8a26-ccdafaa3394d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.322566 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c2066384-4861-4b8b-8a26-ccdafaa3394d" (UID: "c2066384-4861-4b8b-8a26-ccdafaa3394d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.322553 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-inventory" (OuterVolumeSpecName: "inventory") pod "c2066384-4861-4b8b-8a26-ccdafaa3394d" (UID: "c2066384-4861-4b8b-8a26-ccdafaa3394d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.379879 4776 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.379942 4776 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.379958 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngxnk\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-kube-api-access-ngxnk\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.379973 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.379982 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.379994 4776 
reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.380005 4776 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.380015 4776 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.380025 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.380034 4776 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.380043 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.380051 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2066384-4861-4b8b-8a26-ccdafaa3394d-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.380060 4776 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c2066384-4861-4b8b-8a26-ccdafaa3394d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.764470 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" event={"ID":"c2066384-4861-4b8b-8a26-ccdafaa3394d","Type":"ContainerDied","Data":"e3db35aef6ed9ea2a83d5086f75ae9e983494eeb9015a283530383bef2eb7692"} Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.764535 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3db35aef6ed9ea2a83d5086f75ae9e983494eeb9015a283530383bef2eb7692" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.764631 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.856653 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm"] Dec 04 10:23:30 crc kubenswrapper[4776]: E1204 10:23:30.857059 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2066384-4861-4b8b-8a26-ccdafaa3394d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.857081 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2066384-4861-4b8b-8a26-ccdafaa3394d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.857285 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2066384-4861-4b8b-8a26-ccdafaa3394d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 
10:23:30.857903 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.860931 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.861252 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.861677 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.861821 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.867599 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.873590 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm"] Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.990230 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.990985 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqn59\" (UniqueName: \"kubernetes.io/projected/dd3ec814-a647-4575-abd7-fbdec22fd54f-kube-api-access-lqn59\") pod 
\"ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.991121 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:30 crc kubenswrapper[4776]: I1204 10:23:30.991295 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:31 crc kubenswrapper[4776]: I1204 10:23:31.092493 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqn59\" (UniqueName: \"kubernetes.io/projected/dd3ec814-a647-4575-abd7-fbdec22fd54f-kube-api-access-lqn59\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:31 crc kubenswrapper[4776]: I1204 10:23:31.092541 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:31 crc kubenswrapper[4776]: I1204 10:23:31.092629 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:31 crc kubenswrapper[4776]: I1204 10:23:31.092666 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:31 crc kubenswrapper[4776]: I1204 10:23:31.096857 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:31 crc kubenswrapper[4776]: I1204 10:23:31.097939 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:31 crc kubenswrapper[4776]: I1204 10:23:31.098896 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:31 crc kubenswrapper[4776]: I1204 10:23:31.111044 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqn59\" (UniqueName: \"kubernetes.io/projected/dd3ec814-a647-4575-abd7-fbdec22fd54f-kube-api-access-lqn59\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:31 crc kubenswrapper[4776]: I1204 10:23:31.181911 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:31 crc kubenswrapper[4776]: I1204 10:23:31.701610 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm"] Dec 04 10:23:31 crc kubenswrapper[4776]: I1204 10:23:31.774499 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" event={"ID":"dd3ec814-a647-4575-abd7-fbdec22fd54f","Type":"ContainerStarted","Data":"1919ac0dfbcf7f3798601b7d9796e53dcaf5d6d916aae83644437380abcbde2a"} Dec 04 10:23:32 crc kubenswrapper[4776]: I1204 10:23:32.782526 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" event={"ID":"dd3ec814-a647-4575-abd7-fbdec22fd54f","Type":"ContainerStarted","Data":"69f0363418c2af12d4307d86fab875096c0290bca8f2aa5cbd4f47e30aa6c661"} Dec 04 10:23:32 crc kubenswrapper[4776]: I1204 10:23:32.803604 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" podStartSLOduration=2.383293591 podStartE2EDuration="2.803584316s" podCreationTimestamp="2025-12-04 10:23:30 +0000 UTC" firstStartedPulling="2025-12-04 10:23:31.708375443 +0000 UTC m=+2656.574855830" 
lastFinishedPulling="2025-12-04 10:23:32.128666148 +0000 UTC m=+2656.995146555" observedRunningTime="2025-12-04 10:23:32.797127643 +0000 UTC m=+2657.663608040" watchObservedRunningTime="2025-12-04 10:23:32.803584316 +0000 UTC m=+2657.670064693" Dec 04 10:23:37 crc kubenswrapper[4776]: E1204 10:23:37.433004 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd3ec814_a647_4575_abd7_fbdec22fd54f.slice/crio-conmon-69f0363418c2af12d4307d86fab875096c0290bca8f2aa5cbd4f47e30aa6c661.scope\": RecentStats: unable to find data in memory cache]" Dec 04 10:23:37 crc kubenswrapper[4776]: I1204 10:23:37.825063 4776 generic.go:334] "Generic (PLEG): container finished" podID="dd3ec814-a647-4575-abd7-fbdec22fd54f" containerID="69f0363418c2af12d4307d86fab875096c0290bca8f2aa5cbd4f47e30aa6c661" exitCode=0 Dec 04 10:23:37 crc kubenswrapper[4776]: I1204 10:23:37.825128 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" event={"ID":"dd3ec814-a647-4575-abd7-fbdec22fd54f","Type":"ContainerDied","Data":"69f0363418c2af12d4307d86fab875096c0290bca8f2aa5cbd4f47e30aa6c661"} Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.252040 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.353817 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-ssh-key\") pod \"dd3ec814-a647-4575-abd7-fbdec22fd54f\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.353958 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-inventory\") pod \"dd3ec814-a647-4575-abd7-fbdec22fd54f\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.354074 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-ceph\") pod \"dd3ec814-a647-4575-abd7-fbdec22fd54f\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.354192 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqn59\" (UniqueName: \"kubernetes.io/projected/dd3ec814-a647-4575-abd7-fbdec22fd54f-kube-api-access-lqn59\") pod \"dd3ec814-a647-4575-abd7-fbdec22fd54f\" (UID: \"dd3ec814-a647-4575-abd7-fbdec22fd54f\") " Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.361136 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-ceph" (OuterVolumeSpecName: "ceph") pod "dd3ec814-a647-4575-abd7-fbdec22fd54f" (UID: "dd3ec814-a647-4575-abd7-fbdec22fd54f"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.361377 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3ec814-a647-4575-abd7-fbdec22fd54f-kube-api-access-lqn59" (OuterVolumeSpecName: "kube-api-access-lqn59") pod "dd3ec814-a647-4575-abd7-fbdec22fd54f" (UID: "dd3ec814-a647-4575-abd7-fbdec22fd54f"). InnerVolumeSpecName "kube-api-access-lqn59". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.386168 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dd3ec814-a647-4575-abd7-fbdec22fd54f" (UID: "dd3ec814-a647-4575-abd7-fbdec22fd54f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.387175 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-inventory" (OuterVolumeSpecName: "inventory") pod "dd3ec814-a647-4575-abd7-fbdec22fd54f" (UID: "dd3ec814-a647-4575-abd7-fbdec22fd54f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.456137 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.456186 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqn59\" (UniqueName: \"kubernetes.io/projected/dd3ec814-a647-4575-abd7-fbdec22fd54f-kube-api-access-lqn59\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.456204 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.456220 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd3ec814-a647-4575-abd7-fbdec22fd54f-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.841374 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" event={"ID":"dd3ec814-a647-4575-abd7-fbdec22fd54f","Type":"ContainerDied","Data":"1919ac0dfbcf7f3798601b7d9796e53dcaf5d6d916aae83644437380abcbde2a"} Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.841645 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1919ac0dfbcf7f3798601b7d9796e53dcaf5d6d916aae83644437380abcbde2a" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.841438 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.923497 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg"] Dec 04 10:23:39 crc kubenswrapper[4776]: E1204 10:23:39.923954 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3ec814-a647-4575-abd7-fbdec22fd54f" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.923978 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3ec814-a647-4575-abd7-fbdec22fd54f" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.924244 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3ec814-a647-4575-abd7-fbdec22fd54f" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.925094 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.928521 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.928698 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.928855 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.929840 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.929899 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.929903 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.932465 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg"] Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.963351 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.963426 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ssh-key\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.963494 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.963555 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.963583 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:39 crc kubenswrapper[4776]: I1204 10:23:39.963637 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzvx9\" (UniqueName: \"kubernetes.io/projected/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-kube-api-access-rzvx9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc 
kubenswrapper[4776]: I1204 10:23:40.065147 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.065199 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.065264 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzvx9\" (UniqueName: \"kubernetes.io/projected/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-kube-api-access-rzvx9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.065319 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.065374 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: 
\"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.065449 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.066458 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.069707 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.070011 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.070551 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.074896 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.084114 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzvx9\" (UniqueName: \"kubernetes.io/projected/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-kube-api-access-rzvx9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fhnqg\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.242466 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.765411 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg"] Dec 04 10:23:40 crc kubenswrapper[4776]: I1204 10:23:40.850217 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" event={"ID":"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c","Type":"ContainerStarted","Data":"b9e2e0661aab62505d4919a05810c02b9305260c87387c637c02ce1a90f7aa8d"} Dec 04 10:23:41 crc kubenswrapper[4776]: I1204 10:23:41.859618 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" event={"ID":"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c","Type":"ContainerStarted","Data":"b9a511bc6d626a0e6a0105fd09114af115526b60e9b6ff7d16948ef9cc19dd01"} Dec 04 10:23:41 crc kubenswrapper[4776]: I1204 10:23:41.882019 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" podStartSLOduration=2.3807605499999998 podStartE2EDuration="2.88199454s" podCreationTimestamp="2025-12-04 10:23:39 +0000 UTC" firstStartedPulling="2025-12-04 10:23:40.767194903 +0000 UTC m=+2665.633675280" lastFinishedPulling="2025-12-04 10:23:41.268428893 +0000 UTC m=+2666.134909270" observedRunningTime="2025-12-04 10:23:41.873754812 +0000 UTC m=+2666.740235189" watchObservedRunningTime="2025-12-04 10:23:41.88199454 +0000 UTC m=+2666.748474917" Dec 04 10:23:49 crc kubenswrapper[4776]: I1204 10:23:49.380455 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:23:49 crc kubenswrapper[4776]: I1204 10:23:49.382133 
4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:24:19 crc kubenswrapper[4776]: I1204 10:24:19.380122 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:24:19 crc kubenswrapper[4776]: I1204 10:24:19.381105 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:24:49 crc kubenswrapper[4776]: I1204 10:24:49.380054 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:24:49 crc kubenswrapper[4776]: I1204 10:24:49.381780 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:24:49 crc kubenswrapper[4776]: I1204 10:24:49.381930 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 10:24:49 crc kubenswrapper[4776]: I1204 10:24:49.382780 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3b3694b7bd99eb3bb73428b2f0d0a80b952a000625fa6e6dc456d1e286636470"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:24:49 crc kubenswrapper[4776]: I1204 10:24:49.382934 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://3b3694b7bd99eb3bb73428b2f0d0a80b952a000625fa6e6dc456d1e286636470" gracePeriod=600 Dec 04 10:24:49 crc kubenswrapper[4776]: I1204 10:24:49.815407 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="3b3694b7bd99eb3bb73428b2f0d0a80b952a000625fa6e6dc456d1e286636470" exitCode=0 Dec 04 10:24:49 crc kubenswrapper[4776]: I1204 10:24:49.815487 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"3b3694b7bd99eb3bb73428b2f0d0a80b952a000625fa6e6dc456d1e286636470"} Dec 04 10:24:49 crc kubenswrapper[4776]: I1204 10:24:49.815745 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0"} Dec 04 10:24:49 crc kubenswrapper[4776]: I1204 10:24:49.815770 4776 scope.go:117] "RemoveContainer" 
containerID="4294f0e1b80131017971f7061ec77f214624de7298d822d819247e8a4053b572" Dec 04 10:24:51 crc kubenswrapper[4776]: I1204 10:24:51.837142 4776 generic.go:334] "Generic (PLEG): container finished" podID="a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c" containerID="b9a511bc6d626a0e6a0105fd09114af115526b60e9b6ff7d16948ef9cc19dd01" exitCode=0 Dec 04 10:24:51 crc kubenswrapper[4776]: I1204 10:24:51.837225 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" event={"ID":"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c","Type":"ContainerDied","Data":"b9a511bc6d626a0e6a0105fd09114af115526b60e9b6ff7d16948ef9cc19dd01"} Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.248337 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.432593 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ssh-key\") pod \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.432665 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ovn-combined-ca-bundle\") pod \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.432784 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzvx9\" (UniqueName: \"kubernetes.io/projected/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-kube-api-access-rzvx9\") pod \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 
10:24:53.432833 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ceph\") pod \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.432878 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-inventory\") pod \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.433104 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ovncontroller-config-0\") pod \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\" (UID: \"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c\") " Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.440213 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ceph" (OuterVolumeSpecName: "ceph") pod "a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c" (UID: "a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.440286 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-kube-api-access-rzvx9" (OuterVolumeSpecName: "kube-api-access-rzvx9") pod "a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c" (UID: "a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c"). InnerVolumeSpecName "kube-api-access-rzvx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.446199 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c" (UID: "a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.460226 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c" (UID: "a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.461130 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c" (UID: "a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.477311 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-inventory" (OuterVolumeSpecName: "inventory") pod "a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c" (UID: "a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.536203 4776 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.536240 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.536257 4776 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.536269 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzvx9\" (UniqueName: \"kubernetes.io/projected/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-kube-api-access-rzvx9\") on node \"crc\" DevicePath \"\"" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.536305 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.536341 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.851589 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" event={"ID":"a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c","Type":"ContainerDied","Data":"b9e2e0661aab62505d4919a05810c02b9305260c87387c637c02ce1a90f7aa8d"} Dec 04 10:24:53 crc 
kubenswrapper[4776]: I1204 10:24:53.851626 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9e2e0661aab62505d4919a05810c02b9305260c87387c637c02ce1a90f7aa8d" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.851661 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fhnqg" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.957398 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn"] Dec 04 10:24:53 crc kubenswrapper[4776]: E1204 10:24:53.957819 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.957844 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.958046 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.958662 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.962815 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.962883 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.962828 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.963276 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.963369 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.963478 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.963502 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 04 10:24:53 crc kubenswrapper[4776]: I1204 10:24:53.979564 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn"] Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.146655 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.147010 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg7v6\" (UniqueName: \"kubernetes.io/projected/c3288d5d-8705-4058-ac67-ef3c5e0e0359-kube-api-access-tg7v6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.147050 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.147144 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.147281 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.147388 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.147519 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.249608 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.249713 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.249775 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.249846 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.249875 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg7v6\" (UniqueName: \"kubernetes.io/projected/c3288d5d-8705-4058-ac67-ef3c5e0e0359-kube-api-access-tg7v6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.249936 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.250043 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-ceph\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.259646 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.259673 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.259731 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.259796 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.260215 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.260391 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.270134 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg7v6\" (UniqueName: \"kubernetes.io/projected/c3288d5d-8705-4058-ac67-ef3c5e0e0359-kube-api-access-tg7v6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.291356 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.833161 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn"] Dec 04 10:24:54 crc kubenswrapper[4776]: I1204 10:24:54.863249 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" event={"ID":"c3288d5d-8705-4058-ac67-ef3c5e0e0359","Type":"ContainerStarted","Data":"397747c1071e705ad23be1ca5345cd97137eed2da4cdeb43067db9a57617b73d"} Dec 04 10:24:55 crc kubenswrapper[4776]: I1204 10:24:55.873427 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" event={"ID":"c3288d5d-8705-4058-ac67-ef3c5e0e0359","Type":"ContainerStarted","Data":"2acfb3c5ad9aed478b6d1a4dd4dee6396cc91f9e9037471576145268b2094b70"} Dec 04 10:25:54 crc kubenswrapper[4776]: I1204 10:25:54.358198 4776 generic.go:334] "Generic (PLEG): container finished" podID="c3288d5d-8705-4058-ac67-ef3c5e0e0359" containerID="2acfb3c5ad9aed478b6d1a4dd4dee6396cc91f9e9037471576145268b2094b70" exitCode=0 Dec 04 10:25:54 crc kubenswrapper[4776]: I1204 10:25:54.358254 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" event={"ID":"c3288d5d-8705-4058-ac67-ef3c5e0e0359","Type":"ContainerDied","Data":"2acfb3c5ad9aed478b6d1a4dd4dee6396cc91f9e9037471576145268b2094b70"} Dec 04 10:25:55 crc kubenswrapper[4776]: I1204 10:25:55.831936 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.025775 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-neutron-metadata-combined-ca-bundle\") pod \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.026104 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg7v6\" (UniqueName: \"kubernetes.io/projected/c3288d5d-8705-4058-ac67-ef3c5e0e0359-kube-api-access-tg7v6\") pod \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.026139 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-ceph\") pod \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.026763 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-ssh-key\") pod \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.026836 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-inventory\") pod \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.026888 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-nova-metadata-neutron-config-0\") pod \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.026928 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\" (UID: \"c3288d5d-8705-4058-ac67-ef3c5e0e0359\") " Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.032071 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-ceph" (OuterVolumeSpecName: "ceph") pod "c3288d5d-8705-4058-ac67-ef3c5e0e0359" (UID: "c3288d5d-8705-4058-ac67-ef3c5e0e0359"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.032170 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c3288d5d-8705-4058-ac67-ef3c5e0e0359" (UID: "c3288d5d-8705-4058-ac67-ef3c5e0e0359"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.046532 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3288d5d-8705-4058-ac67-ef3c5e0e0359-kube-api-access-tg7v6" (OuterVolumeSpecName: "kube-api-access-tg7v6") pod "c3288d5d-8705-4058-ac67-ef3c5e0e0359" (UID: "c3288d5d-8705-4058-ac67-ef3c5e0e0359"). InnerVolumeSpecName "kube-api-access-tg7v6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.057667 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c3288d5d-8705-4058-ac67-ef3c5e0e0359" (UID: "c3288d5d-8705-4058-ac67-ef3c5e0e0359"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.059496 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-inventory" (OuterVolumeSpecName: "inventory") pod "c3288d5d-8705-4058-ac67-ef3c5e0e0359" (UID: "c3288d5d-8705-4058-ac67-ef3c5e0e0359"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.060692 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c3288d5d-8705-4058-ac67-ef3c5e0e0359" (UID: "c3288d5d-8705-4058-ac67-ef3c5e0e0359"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.060844 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c3288d5d-8705-4058-ac67-ef3c5e0e0359" (UID: "c3288d5d-8705-4058-ac67-ef3c5e0e0359"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.129835 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.129907 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.129932 4776 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.129949 4776 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.129963 4776 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.129977 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg7v6\" (UniqueName: \"kubernetes.io/projected/c3288d5d-8705-4058-ac67-ef3c5e0e0359-kube-api-access-tg7v6\") on node \"crc\" DevicePath \"\"" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.129988 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3288d5d-8705-4058-ac67-ef3c5e0e0359-ceph\") on node \"crc\" 
DevicePath \"\"" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.403864 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" event={"ID":"c3288d5d-8705-4058-ac67-ef3c5e0e0359","Type":"ContainerDied","Data":"397747c1071e705ad23be1ca5345cd97137eed2da4cdeb43067db9a57617b73d"} Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.403900 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="397747c1071e705ad23be1ca5345cd97137eed2da4cdeb43067db9a57617b73d" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.403955 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.515501 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq"] Dec 04 10:25:56 crc kubenswrapper[4776]: E1204 10:25:56.515851 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3288d5d-8705-4058-ac67-ef3c5e0e0359" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.515890 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3288d5d-8705-4058-ac67-ef3c5e0e0359" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.516114 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3288d5d-8705-4058-ac67-ef3c5e0e0359" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.516666 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.519091 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.519103 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.520688 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.521112 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.521558 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.521731 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.523991 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq"] Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.637531 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.637671 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.637781 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.638961 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj5s4\" (UniqueName: \"kubernetes.io/projected/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-kube-api-access-bj5s4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.639042 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.639139 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.741859 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj5s4\" (UniqueName: \"kubernetes.io/projected/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-kube-api-access-bj5s4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.741940 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.741991 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.742049 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.742081 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.742131 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.747624 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.747673 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.748412 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.750653 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.751372 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.762018 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj5s4\" (UniqueName: \"kubernetes.io/projected/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-kube-api-access-bj5s4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:56 crc kubenswrapper[4776]: I1204 10:25:56.835497 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:25:57 crc kubenswrapper[4776]: I1204 10:25:57.342190 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq"] Dec 04 10:25:57 crc kubenswrapper[4776]: I1204 10:25:57.413291 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" event={"ID":"c95fc34d-f4d9-45d9-acf3-a4fb114a972e","Type":"ContainerStarted","Data":"21f515e87c74c15096ee1c919ddca8e60d046f193a1d8ecc3be1b23a0cdea3f7"} Dec 04 10:25:58 crc kubenswrapper[4776]: I1204 10:25:58.422288 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" event={"ID":"c95fc34d-f4d9-45d9-acf3-a4fb114a972e","Type":"ContainerStarted","Data":"981eec68edd56d698414e8b1726d95e1632167f0aafba3f23a48bde0afa95a6e"} Dec 04 10:25:58 crc kubenswrapper[4776]: I1204 10:25:58.440650 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" podStartSLOduration=1.993645764 podStartE2EDuration="2.44063241s" podCreationTimestamp="2025-12-04 10:25:56 +0000 UTC" firstStartedPulling="2025-12-04 10:25:57.348184084 +0000 UTC m=+2802.214664471" lastFinishedPulling="2025-12-04 10:25:57.79517074 +0000 UTC m=+2802.661651117" observedRunningTime="2025-12-04 10:25:58.436790318 +0000 UTC m=+2803.303270695" watchObservedRunningTime="2025-12-04 10:25:58.44063241 +0000 UTC m=+2803.307112787" Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.010793 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7njtx"] Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.012716 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.024548 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7njtx"] Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.182948 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f247a7-d023-4081-bb17-406c55161ea0-utilities\") pod \"community-operators-7njtx\" (UID: \"61f247a7-d023-4081-bb17-406c55161ea0\") " pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.183163 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f247a7-d023-4081-bb17-406c55161ea0-catalog-content\") pod \"community-operators-7njtx\" (UID: \"61f247a7-d023-4081-bb17-406c55161ea0\") " pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.183354 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kq4w\" (UniqueName: \"kubernetes.io/projected/61f247a7-d023-4081-bb17-406c55161ea0-kube-api-access-9kq4w\") pod \"community-operators-7njtx\" (UID: \"61f247a7-d023-4081-bb17-406c55161ea0\") " pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.285319 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kq4w\" (UniqueName: \"kubernetes.io/projected/61f247a7-d023-4081-bb17-406c55161ea0-kube-api-access-9kq4w\") pod \"community-operators-7njtx\" (UID: \"61f247a7-d023-4081-bb17-406c55161ea0\") " pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.285802 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f247a7-d023-4081-bb17-406c55161ea0-utilities\") pod \"community-operators-7njtx\" (UID: \"61f247a7-d023-4081-bb17-406c55161ea0\") " pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.285902 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f247a7-d023-4081-bb17-406c55161ea0-catalog-content\") pod \"community-operators-7njtx\" (UID: \"61f247a7-d023-4081-bb17-406c55161ea0\") " pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.286359 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f247a7-d023-4081-bb17-406c55161ea0-utilities\") pod \"community-operators-7njtx\" (UID: \"61f247a7-d023-4081-bb17-406c55161ea0\") " pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.286490 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f247a7-d023-4081-bb17-406c55161ea0-catalog-content\") pod \"community-operators-7njtx\" (UID: \"61f247a7-d023-4081-bb17-406c55161ea0\") " pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.305716 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kq4w\" (UniqueName: \"kubernetes.io/projected/61f247a7-d023-4081-bb17-406c55161ea0-kube-api-access-9kq4w\") pod \"community-operators-7njtx\" (UID: \"61f247a7-d023-4081-bb17-406c55161ea0\") " pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.343803 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:25:59 crc kubenswrapper[4776]: I1204 10:25:59.887151 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7njtx"] Dec 04 10:26:00 crc kubenswrapper[4776]: I1204 10:26:00.444956 4776 generic.go:334] "Generic (PLEG): container finished" podID="61f247a7-d023-4081-bb17-406c55161ea0" containerID="894a695ac2cd1f3c7625a3a670c4d395fc933564662cd3b6fc4022b36c3dbbca" exitCode=0 Dec 04 10:26:00 crc kubenswrapper[4776]: I1204 10:26:00.445101 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7njtx" event={"ID":"61f247a7-d023-4081-bb17-406c55161ea0","Type":"ContainerDied","Data":"894a695ac2cd1f3c7625a3a670c4d395fc933564662cd3b6fc4022b36c3dbbca"} Dec 04 10:26:00 crc kubenswrapper[4776]: I1204 10:26:00.445352 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7njtx" event={"ID":"61f247a7-d023-4081-bb17-406c55161ea0","Type":"ContainerStarted","Data":"0f5aff53be31442b3dea024d88af7545d71a3c06c21bf59910877d084f6b5ed7"} Dec 04 10:26:02 crc kubenswrapper[4776]: I1204 10:26:02.467583 4776 generic.go:334] "Generic (PLEG): container finished" podID="61f247a7-d023-4081-bb17-406c55161ea0" containerID="91dfe2e2ebead9d8c0666a9c87dce316bff66184843b7d9ec341309ba86ebc04" exitCode=0 Dec 04 10:26:02 crc kubenswrapper[4776]: I1204 10:26:02.468579 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7njtx" event={"ID":"61f247a7-d023-4081-bb17-406c55161ea0","Type":"ContainerDied","Data":"91dfe2e2ebead9d8c0666a9c87dce316bff66184843b7d9ec341309ba86ebc04"} Dec 04 10:26:03 crc kubenswrapper[4776]: I1204 10:26:03.477699 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7njtx" 
event={"ID":"61f247a7-d023-4081-bb17-406c55161ea0","Type":"ContainerStarted","Data":"a999dcf5b0238d0d97dca31d17d7146c4a61f90c2cb4267ebf30b0002013998e"} Dec 04 10:26:03 crc kubenswrapper[4776]: I1204 10:26:03.495130 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7njtx" podStartSLOduration=3.009111976 podStartE2EDuration="5.495110272s" podCreationTimestamp="2025-12-04 10:25:58 +0000 UTC" firstStartedPulling="2025-12-04 10:26:00.447885282 +0000 UTC m=+2805.314365659" lastFinishedPulling="2025-12-04 10:26:02.933883578 +0000 UTC m=+2807.800363955" observedRunningTime="2025-12-04 10:26:03.492011784 +0000 UTC m=+2808.358492191" watchObservedRunningTime="2025-12-04 10:26:03.495110272 +0000 UTC m=+2808.361590649" Dec 04 10:26:09 crc kubenswrapper[4776]: I1204 10:26:09.344324 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:26:09 crc kubenswrapper[4776]: I1204 10:26:09.344762 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:26:09 crc kubenswrapper[4776]: I1204 10:26:09.397639 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:26:09 crc kubenswrapper[4776]: I1204 10:26:09.568832 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:26:09 crc kubenswrapper[4776]: I1204 10:26:09.631216 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7njtx"] Dec 04 10:26:11 crc kubenswrapper[4776]: I1204 10:26:11.541810 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7njtx" podUID="61f247a7-d023-4081-bb17-406c55161ea0" containerName="registry-server" 
containerID="cri-o://a999dcf5b0238d0d97dca31d17d7146c4a61f90c2cb4267ebf30b0002013998e" gracePeriod=2 Dec 04 10:26:11 crc kubenswrapper[4776]: I1204 10:26:11.989953 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.022252 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f247a7-d023-4081-bb17-406c55161ea0-utilities\") pod \"61f247a7-d023-4081-bb17-406c55161ea0\" (UID: \"61f247a7-d023-4081-bb17-406c55161ea0\") " Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.022430 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f247a7-d023-4081-bb17-406c55161ea0-catalog-content\") pod \"61f247a7-d023-4081-bb17-406c55161ea0\" (UID: \"61f247a7-d023-4081-bb17-406c55161ea0\") " Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.022501 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kq4w\" (UniqueName: \"kubernetes.io/projected/61f247a7-d023-4081-bb17-406c55161ea0-kube-api-access-9kq4w\") pod \"61f247a7-d023-4081-bb17-406c55161ea0\" (UID: \"61f247a7-d023-4081-bb17-406c55161ea0\") " Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.023596 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61f247a7-d023-4081-bb17-406c55161ea0-utilities" (OuterVolumeSpecName: "utilities") pod "61f247a7-d023-4081-bb17-406c55161ea0" (UID: "61f247a7-d023-4081-bb17-406c55161ea0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.040448 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f247a7-d023-4081-bb17-406c55161ea0-kube-api-access-9kq4w" (OuterVolumeSpecName: "kube-api-access-9kq4w") pod "61f247a7-d023-4081-bb17-406c55161ea0" (UID: "61f247a7-d023-4081-bb17-406c55161ea0"). InnerVolumeSpecName "kube-api-access-9kq4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.074369 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61f247a7-d023-4081-bb17-406c55161ea0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61f247a7-d023-4081-bb17-406c55161ea0" (UID: "61f247a7-d023-4081-bb17-406c55161ea0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.124325 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f247a7-d023-4081-bb17-406c55161ea0-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.124358 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f247a7-d023-4081-bb17-406c55161ea0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.124388 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kq4w\" (UniqueName: \"kubernetes.io/projected/61f247a7-d023-4081-bb17-406c55161ea0-kube-api-access-9kq4w\") on node \"crc\" DevicePath \"\"" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.551673 4776 generic.go:334] "Generic (PLEG): container finished" podID="61f247a7-d023-4081-bb17-406c55161ea0" 
containerID="a999dcf5b0238d0d97dca31d17d7146c4a61f90c2cb4267ebf30b0002013998e" exitCode=0 Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.551713 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7njtx" event={"ID":"61f247a7-d023-4081-bb17-406c55161ea0","Type":"ContainerDied","Data":"a999dcf5b0238d0d97dca31d17d7146c4a61f90c2cb4267ebf30b0002013998e"} Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.551739 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7njtx" event={"ID":"61f247a7-d023-4081-bb17-406c55161ea0","Type":"ContainerDied","Data":"0f5aff53be31442b3dea024d88af7545d71a3c06c21bf59910877d084f6b5ed7"} Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.551757 4776 scope.go:117] "RemoveContainer" containerID="a999dcf5b0238d0d97dca31d17d7146c4a61f90c2cb4267ebf30b0002013998e" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.552417 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7njtx" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.573131 4776 scope.go:117] "RemoveContainer" containerID="91dfe2e2ebead9d8c0666a9c87dce316bff66184843b7d9ec341309ba86ebc04" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.600997 4776 scope.go:117] "RemoveContainer" containerID="894a695ac2cd1f3c7625a3a670c4d395fc933564662cd3b6fc4022b36c3dbbca" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.606225 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7njtx"] Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.615142 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7njtx"] Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.661044 4776 scope.go:117] "RemoveContainer" containerID="a999dcf5b0238d0d97dca31d17d7146c4a61f90c2cb4267ebf30b0002013998e" Dec 04 10:26:12 crc kubenswrapper[4776]: E1204 10:26:12.661436 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a999dcf5b0238d0d97dca31d17d7146c4a61f90c2cb4267ebf30b0002013998e\": container with ID starting with a999dcf5b0238d0d97dca31d17d7146c4a61f90c2cb4267ebf30b0002013998e not found: ID does not exist" containerID="a999dcf5b0238d0d97dca31d17d7146c4a61f90c2cb4267ebf30b0002013998e" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.661468 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a999dcf5b0238d0d97dca31d17d7146c4a61f90c2cb4267ebf30b0002013998e"} err="failed to get container status \"a999dcf5b0238d0d97dca31d17d7146c4a61f90c2cb4267ebf30b0002013998e\": rpc error: code = NotFound desc = could not find container \"a999dcf5b0238d0d97dca31d17d7146c4a61f90c2cb4267ebf30b0002013998e\": container with ID starting with a999dcf5b0238d0d97dca31d17d7146c4a61f90c2cb4267ebf30b0002013998e not 
found: ID does not exist" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.661489 4776 scope.go:117] "RemoveContainer" containerID="91dfe2e2ebead9d8c0666a9c87dce316bff66184843b7d9ec341309ba86ebc04" Dec 04 10:26:12 crc kubenswrapper[4776]: E1204 10:26:12.661851 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91dfe2e2ebead9d8c0666a9c87dce316bff66184843b7d9ec341309ba86ebc04\": container with ID starting with 91dfe2e2ebead9d8c0666a9c87dce316bff66184843b7d9ec341309ba86ebc04 not found: ID does not exist" containerID="91dfe2e2ebead9d8c0666a9c87dce316bff66184843b7d9ec341309ba86ebc04" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.661874 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91dfe2e2ebead9d8c0666a9c87dce316bff66184843b7d9ec341309ba86ebc04"} err="failed to get container status \"91dfe2e2ebead9d8c0666a9c87dce316bff66184843b7d9ec341309ba86ebc04\": rpc error: code = NotFound desc = could not find container \"91dfe2e2ebead9d8c0666a9c87dce316bff66184843b7d9ec341309ba86ebc04\": container with ID starting with 91dfe2e2ebead9d8c0666a9c87dce316bff66184843b7d9ec341309ba86ebc04 not found: ID does not exist" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.661888 4776 scope.go:117] "RemoveContainer" containerID="894a695ac2cd1f3c7625a3a670c4d395fc933564662cd3b6fc4022b36c3dbbca" Dec 04 10:26:12 crc kubenswrapper[4776]: E1204 10:26:12.662150 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"894a695ac2cd1f3c7625a3a670c4d395fc933564662cd3b6fc4022b36c3dbbca\": container with ID starting with 894a695ac2cd1f3c7625a3a670c4d395fc933564662cd3b6fc4022b36c3dbbca not found: ID does not exist" containerID="894a695ac2cd1f3c7625a3a670c4d395fc933564662cd3b6fc4022b36c3dbbca" Dec 04 10:26:12 crc kubenswrapper[4776]: I1204 10:26:12.662170 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"894a695ac2cd1f3c7625a3a670c4d395fc933564662cd3b6fc4022b36c3dbbca"} err="failed to get container status \"894a695ac2cd1f3c7625a3a670c4d395fc933564662cd3b6fc4022b36c3dbbca\": rpc error: code = NotFound desc = could not find container \"894a695ac2cd1f3c7625a3a670c4d395fc933564662cd3b6fc4022b36c3dbbca\": container with ID starting with 894a695ac2cd1f3c7625a3a670c4d395fc933564662cd3b6fc4022b36c3dbbca not found: ID does not exist" Dec 04 10:26:13 crc kubenswrapper[4776]: I1204 10:26:13.463835 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f247a7-d023-4081-bb17-406c55161ea0" path="/var/lib/kubelet/pods/61f247a7-d023-4081-bb17-406c55161ea0/volumes" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.739652 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sdn9s"] Dec 04 10:26:43 crc kubenswrapper[4776]: E1204 10:26:43.740578 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f247a7-d023-4081-bb17-406c55161ea0" containerName="registry-server" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.740592 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f247a7-d023-4081-bb17-406c55161ea0" containerName="registry-server" Dec 04 10:26:43 crc kubenswrapper[4776]: E1204 10:26:43.740633 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f247a7-d023-4081-bb17-406c55161ea0" containerName="extract-utilities" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.740642 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f247a7-d023-4081-bb17-406c55161ea0" containerName="extract-utilities" Dec 04 10:26:43 crc kubenswrapper[4776]: E1204 10:26:43.740654 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f247a7-d023-4081-bb17-406c55161ea0" containerName="extract-content" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 
10:26:43.740664 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f247a7-d023-4081-bb17-406c55161ea0" containerName="extract-content" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.740874 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f247a7-d023-4081-bb17-406c55161ea0" containerName="registry-server" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.742302 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.756630 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sdn9s"] Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.815385 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-catalog-content\") pod \"redhat-marketplace-sdn9s\" (UID: \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\") " pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.815542 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgh9j\" (UniqueName: \"kubernetes.io/projected/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-kube-api-access-qgh9j\") pod \"redhat-marketplace-sdn9s\" (UID: \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\") " pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.815585 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-utilities\") pod \"redhat-marketplace-sdn9s\" (UID: \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\") " pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:43 crc kubenswrapper[4776]: 
I1204 10:26:43.917164 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-catalog-content\") pod \"redhat-marketplace-sdn9s\" (UID: \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\") " pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.917295 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgh9j\" (UniqueName: \"kubernetes.io/projected/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-kube-api-access-qgh9j\") pod \"redhat-marketplace-sdn9s\" (UID: \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\") " pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.917339 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-utilities\") pod \"redhat-marketplace-sdn9s\" (UID: \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\") " pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.917870 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-utilities\") pod \"redhat-marketplace-sdn9s\" (UID: \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\") " pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.917945 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-catalog-content\") pod \"redhat-marketplace-sdn9s\" (UID: \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\") " pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:43 crc kubenswrapper[4776]: I1204 10:26:43.948236 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgh9j\" (UniqueName: \"kubernetes.io/projected/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-kube-api-access-qgh9j\") pod \"redhat-marketplace-sdn9s\" (UID: \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\") " pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:44 crc kubenswrapper[4776]: I1204 10:26:44.066715 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:44 crc kubenswrapper[4776]: I1204 10:26:44.542666 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sdn9s"] Dec 04 10:26:44 crc kubenswrapper[4776]: I1204 10:26:44.630494 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdn9s" event={"ID":"0dff1626-4cc8-454a-a39c-6c281cc4c8f3","Type":"ContainerStarted","Data":"16240a79751fca774c133d45826ed3b45aeeba2c7fd1029c958d2ba2b7c56662"} Dec 04 10:26:45 crc kubenswrapper[4776]: I1204 10:26:45.640946 4776 generic.go:334] "Generic (PLEG): container finished" podID="0dff1626-4cc8-454a-a39c-6c281cc4c8f3" containerID="2917856554ae5737d793e68be1e604310632258f5efda31954b9c4668fa9d734" exitCode=0 Dec 04 10:26:45 crc kubenswrapper[4776]: I1204 10:26:45.641169 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdn9s" event={"ID":"0dff1626-4cc8-454a-a39c-6c281cc4c8f3","Type":"ContainerDied","Data":"2917856554ae5737d793e68be1e604310632258f5efda31954b9c4668fa9d734"} Dec 04 10:26:45 crc kubenswrapper[4776]: I1204 10:26:45.642878 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:26:46 crc kubenswrapper[4776]: I1204 10:26:46.652283 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdn9s" 
event={"ID":"0dff1626-4cc8-454a-a39c-6c281cc4c8f3","Type":"ContainerStarted","Data":"47cf6a07896b263a1d158a6ac7d60bd9637813aa623beb81375e263fb8f038c2"} Dec 04 10:26:47 crc kubenswrapper[4776]: I1204 10:26:47.674251 4776 generic.go:334] "Generic (PLEG): container finished" podID="0dff1626-4cc8-454a-a39c-6c281cc4c8f3" containerID="47cf6a07896b263a1d158a6ac7d60bd9637813aa623beb81375e263fb8f038c2" exitCode=0 Dec 04 10:26:47 crc kubenswrapper[4776]: I1204 10:26:47.674560 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdn9s" event={"ID":"0dff1626-4cc8-454a-a39c-6c281cc4c8f3","Type":"ContainerDied","Data":"47cf6a07896b263a1d158a6ac7d60bd9637813aa623beb81375e263fb8f038c2"} Dec 04 10:26:47 crc kubenswrapper[4776]: I1204 10:26:47.674586 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdn9s" event={"ID":"0dff1626-4cc8-454a-a39c-6c281cc4c8f3","Type":"ContainerStarted","Data":"4b53e19d16afeb15b5ef92d5b9a00d2227524f1c4187c94e7d6c4cca835d1294"} Dec 04 10:26:47 crc kubenswrapper[4776]: I1204 10:26:47.705123 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sdn9s" podStartSLOduration=3.144099151 podStartE2EDuration="4.70510356s" podCreationTimestamp="2025-12-04 10:26:43 +0000 UTC" firstStartedPulling="2025-12-04 10:26:45.642624503 +0000 UTC m=+2850.509104900" lastFinishedPulling="2025-12-04 10:26:47.203628932 +0000 UTC m=+2852.070109309" observedRunningTime="2025-12-04 10:26:47.703645413 +0000 UTC m=+2852.570125800" watchObservedRunningTime="2025-12-04 10:26:47.70510356 +0000 UTC m=+2852.571583947" Dec 04 10:26:49 crc kubenswrapper[4776]: I1204 10:26:49.380365 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 04 10:26:49 crc kubenswrapper[4776]: I1204 10:26:49.380808 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:26:54 crc kubenswrapper[4776]: I1204 10:26:54.067600 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:54 crc kubenswrapper[4776]: I1204 10:26:54.068180 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:54 crc kubenswrapper[4776]: I1204 10:26:54.113305 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:54 crc kubenswrapper[4776]: I1204 10:26:54.780126 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:55 crc kubenswrapper[4776]: I1204 10:26:55.324321 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sdn9s"] Dec 04 10:26:56 crc kubenswrapper[4776]: I1204 10:26:56.756169 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sdn9s" podUID="0dff1626-4cc8-454a-a39c-6c281cc4c8f3" containerName="registry-server" containerID="cri-o://4b53e19d16afeb15b5ef92d5b9a00d2227524f1c4187c94e7d6c4cca835d1294" gracePeriod=2 Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.277519 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.343082 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgh9j\" (UniqueName: \"kubernetes.io/projected/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-kube-api-access-qgh9j\") pod \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\" (UID: \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\") " Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.343152 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-utilities\") pod \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\" (UID: \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\") " Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.343233 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-catalog-content\") pod \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\" (UID: \"0dff1626-4cc8-454a-a39c-6c281cc4c8f3\") " Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.344345 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-utilities" (OuterVolumeSpecName: "utilities") pod "0dff1626-4cc8-454a-a39c-6c281cc4c8f3" (UID: "0dff1626-4cc8-454a-a39c-6c281cc4c8f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.349361 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-kube-api-access-qgh9j" (OuterVolumeSpecName: "kube-api-access-qgh9j") pod "0dff1626-4cc8-454a-a39c-6c281cc4c8f3" (UID: "0dff1626-4cc8-454a-a39c-6c281cc4c8f3"). InnerVolumeSpecName "kube-api-access-qgh9j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.360057 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0dff1626-4cc8-454a-a39c-6c281cc4c8f3" (UID: "0dff1626-4cc8-454a-a39c-6c281cc4c8f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.445564 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgh9j\" (UniqueName: \"kubernetes.io/projected/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-kube-api-access-qgh9j\") on node \"crc\" DevicePath \"\"" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.445611 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.445797 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dff1626-4cc8-454a-a39c-6c281cc4c8f3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.771276 4776 generic.go:334] "Generic (PLEG): container finished" podID="0dff1626-4cc8-454a-a39c-6c281cc4c8f3" containerID="4b53e19d16afeb15b5ef92d5b9a00d2227524f1c4187c94e7d6c4cca835d1294" exitCode=0 Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.771333 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdn9s" event={"ID":"0dff1626-4cc8-454a-a39c-6c281cc4c8f3","Type":"ContainerDied","Data":"4b53e19d16afeb15b5ef92d5b9a00d2227524f1c4187c94e7d6c4cca835d1294"} Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.771405 4776 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-sdn9s" event={"ID":"0dff1626-4cc8-454a-a39c-6c281cc4c8f3","Type":"ContainerDied","Data":"16240a79751fca774c133d45826ed3b45aeeba2c7fd1029c958d2ba2b7c56662"} Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.771436 4776 scope.go:117] "RemoveContainer" containerID="4b53e19d16afeb15b5ef92d5b9a00d2227524f1c4187c94e7d6c4cca835d1294" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.771351 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sdn9s" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.810425 4776 scope.go:117] "RemoveContainer" containerID="47cf6a07896b263a1d158a6ac7d60bd9637813aa623beb81375e263fb8f038c2" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.819763 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sdn9s"] Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.831074 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sdn9s"] Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.832893 4776 scope.go:117] "RemoveContainer" containerID="2917856554ae5737d793e68be1e604310632258f5efda31954b9c4668fa9d734" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.876192 4776 scope.go:117] "RemoveContainer" containerID="4b53e19d16afeb15b5ef92d5b9a00d2227524f1c4187c94e7d6c4cca835d1294" Dec 04 10:26:57 crc kubenswrapper[4776]: E1204 10:26:57.876801 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b53e19d16afeb15b5ef92d5b9a00d2227524f1c4187c94e7d6c4cca835d1294\": container with ID starting with 4b53e19d16afeb15b5ef92d5b9a00d2227524f1c4187c94e7d6c4cca835d1294 not found: ID does not exist" containerID="4b53e19d16afeb15b5ef92d5b9a00d2227524f1c4187c94e7d6c4cca835d1294" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.876838 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b53e19d16afeb15b5ef92d5b9a00d2227524f1c4187c94e7d6c4cca835d1294"} err="failed to get container status \"4b53e19d16afeb15b5ef92d5b9a00d2227524f1c4187c94e7d6c4cca835d1294\": rpc error: code = NotFound desc = could not find container \"4b53e19d16afeb15b5ef92d5b9a00d2227524f1c4187c94e7d6c4cca835d1294\": container with ID starting with 4b53e19d16afeb15b5ef92d5b9a00d2227524f1c4187c94e7d6c4cca835d1294 not found: ID does not exist" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.876862 4776 scope.go:117] "RemoveContainer" containerID="47cf6a07896b263a1d158a6ac7d60bd9637813aa623beb81375e263fb8f038c2" Dec 04 10:26:57 crc kubenswrapper[4776]: E1204 10:26:57.877585 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47cf6a07896b263a1d158a6ac7d60bd9637813aa623beb81375e263fb8f038c2\": container with ID starting with 47cf6a07896b263a1d158a6ac7d60bd9637813aa623beb81375e263fb8f038c2 not found: ID does not exist" containerID="47cf6a07896b263a1d158a6ac7d60bd9637813aa623beb81375e263fb8f038c2" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.877626 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47cf6a07896b263a1d158a6ac7d60bd9637813aa623beb81375e263fb8f038c2"} err="failed to get container status \"47cf6a07896b263a1d158a6ac7d60bd9637813aa623beb81375e263fb8f038c2\": rpc error: code = NotFound desc = could not find container \"47cf6a07896b263a1d158a6ac7d60bd9637813aa623beb81375e263fb8f038c2\": container with ID starting with 47cf6a07896b263a1d158a6ac7d60bd9637813aa623beb81375e263fb8f038c2 not found: ID does not exist" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.877646 4776 scope.go:117] "RemoveContainer" containerID="2917856554ae5737d793e68be1e604310632258f5efda31954b9c4668fa9d734" Dec 04 10:26:57 crc kubenswrapper[4776]: E1204 
10:26:57.878076 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2917856554ae5737d793e68be1e604310632258f5efda31954b9c4668fa9d734\": container with ID starting with 2917856554ae5737d793e68be1e604310632258f5efda31954b9c4668fa9d734 not found: ID does not exist" containerID="2917856554ae5737d793e68be1e604310632258f5efda31954b9c4668fa9d734" Dec 04 10:26:57 crc kubenswrapper[4776]: I1204 10:26:57.878127 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2917856554ae5737d793e68be1e604310632258f5efda31954b9c4668fa9d734"} err="failed to get container status \"2917856554ae5737d793e68be1e604310632258f5efda31954b9c4668fa9d734\": rpc error: code = NotFound desc = could not find container \"2917856554ae5737d793e68be1e604310632258f5efda31954b9c4668fa9d734\": container with ID starting with 2917856554ae5737d793e68be1e604310632258f5efda31954b9c4668fa9d734 not found: ID does not exist" Dec 04 10:26:59 crc kubenswrapper[4776]: I1204 10:26:59.463801 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dff1626-4cc8-454a-a39c-6c281cc4c8f3" path="/var/lib/kubelet/pods/0dff1626-4cc8-454a-a39c-6c281cc4c8f3/volumes" Dec 04 10:27:12 crc kubenswrapper[4776]: I1204 10:27:12.961234 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kj8z2"] Dec 04 10:27:12 crc kubenswrapper[4776]: E1204 10:27:12.962727 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dff1626-4cc8-454a-a39c-6c281cc4c8f3" containerName="registry-server" Dec 04 10:27:12 crc kubenswrapper[4776]: I1204 10:27:12.962765 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dff1626-4cc8-454a-a39c-6c281cc4c8f3" containerName="registry-server" Dec 04 10:27:12 crc kubenswrapper[4776]: E1204 10:27:12.962779 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dff1626-4cc8-454a-a39c-6c281cc4c8f3" 
containerName="extract-content" Dec 04 10:27:12 crc kubenswrapper[4776]: I1204 10:27:12.962787 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dff1626-4cc8-454a-a39c-6c281cc4c8f3" containerName="extract-content" Dec 04 10:27:12 crc kubenswrapper[4776]: E1204 10:27:12.962841 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dff1626-4cc8-454a-a39c-6c281cc4c8f3" containerName="extract-utilities" Dec 04 10:27:12 crc kubenswrapper[4776]: I1204 10:27:12.962851 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dff1626-4cc8-454a-a39c-6c281cc4c8f3" containerName="extract-utilities" Dec 04 10:27:12 crc kubenswrapper[4776]: I1204 10:27:12.963144 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dff1626-4cc8-454a-a39c-6c281cc4c8f3" containerName="registry-server" Dec 04 10:27:12 crc kubenswrapper[4776]: I1204 10:27:12.964862 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:12 crc kubenswrapper[4776]: I1204 10:27:12.969423 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kj8z2"] Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.128994 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cdce026-dfea-4328-ac53-ae69073d630e-utilities\") pod \"redhat-operators-kj8z2\" (UID: \"3cdce026-dfea-4328-ac53-ae69073d630e\") " pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.129242 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cdce026-dfea-4328-ac53-ae69073d630e-catalog-content\") pod \"redhat-operators-kj8z2\" (UID: \"3cdce026-dfea-4328-ac53-ae69073d630e\") " pod="openshift-marketplace/redhat-operators-kj8z2" Dec 
04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.129338 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b4vp\" (UniqueName: \"kubernetes.io/projected/3cdce026-dfea-4328-ac53-ae69073d630e-kube-api-access-7b4vp\") pod \"redhat-operators-kj8z2\" (UID: \"3cdce026-dfea-4328-ac53-ae69073d630e\") " pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.231079 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b4vp\" (UniqueName: \"kubernetes.io/projected/3cdce026-dfea-4328-ac53-ae69073d630e-kube-api-access-7b4vp\") pod \"redhat-operators-kj8z2\" (UID: \"3cdce026-dfea-4328-ac53-ae69073d630e\") " pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.231207 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cdce026-dfea-4328-ac53-ae69073d630e-utilities\") pod \"redhat-operators-kj8z2\" (UID: \"3cdce026-dfea-4328-ac53-ae69073d630e\") " pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.231254 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cdce026-dfea-4328-ac53-ae69073d630e-catalog-content\") pod \"redhat-operators-kj8z2\" (UID: \"3cdce026-dfea-4328-ac53-ae69073d630e\") " pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.231772 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cdce026-dfea-4328-ac53-ae69073d630e-utilities\") pod \"redhat-operators-kj8z2\" (UID: \"3cdce026-dfea-4328-ac53-ae69073d630e\") " pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:13 crc 
kubenswrapper[4776]: I1204 10:27:13.231844 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cdce026-dfea-4328-ac53-ae69073d630e-catalog-content\") pod \"redhat-operators-kj8z2\" (UID: \"3cdce026-dfea-4328-ac53-ae69073d630e\") " pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.250806 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b4vp\" (UniqueName: \"kubernetes.io/projected/3cdce026-dfea-4328-ac53-ae69073d630e-kube-api-access-7b4vp\") pod \"redhat-operators-kj8z2\" (UID: \"3cdce026-dfea-4328-ac53-ae69073d630e\") " pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.300125 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.760660 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b27jj"] Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.762909 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.776442 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b27jj"] Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.827678 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kj8z2"] Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.843344 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw668\" (UniqueName: \"kubernetes.io/projected/111a5322-ba8a-4efa-b746-a1827418badd-kube-api-access-dw668\") pod \"certified-operators-b27jj\" (UID: \"111a5322-ba8a-4efa-b746-a1827418badd\") " pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.843391 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111a5322-ba8a-4efa-b746-a1827418badd-utilities\") pod \"certified-operators-b27jj\" (UID: \"111a5322-ba8a-4efa-b746-a1827418badd\") " pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.843476 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111a5322-ba8a-4efa-b746-a1827418badd-catalog-content\") pod \"certified-operators-b27jj\" (UID: \"111a5322-ba8a-4efa-b746-a1827418badd\") " pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.917145 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kj8z2" event={"ID":"3cdce026-dfea-4328-ac53-ae69073d630e","Type":"ContainerStarted","Data":"a9210f89e8bc002ac745a81b81ccb51d776b04a213e7364df4a6466177d8c1a5"} Dec 04 
10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.944824 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111a5322-ba8a-4efa-b746-a1827418badd-catalog-content\") pod \"certified-operators-b27jj\" (UID: \"111a5322-ba8a-4efa-b746-a1827418badd\") " pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.944948 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw668\" (UniqueName: \"kubernetes.io/projected/111a5322-ba8a-4efa-b746-a1827418badd-kube-api-access-dw668\") pod \"certified-operators-b27jj\" (UID: \"111a5322-ba8a-4efa-b746-a1827418badd\") " pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.944974 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111a5322-ba8a-4efa-b746-a1827418badd-utilities\") pod \"certified-operators-b27jj\" (UID: \"111a5322-ba8a-4efa-b746-a1827418badd\") " pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.945427 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111a5322-ba8a-4efa-b746-a1827418badd-utilities\") pod \"certified-operators-b27jj\" (UID: \"111a5322-ba8a-4efa-b746-a1827418badd\") " pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 10:27:13.945706 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111a5322-ba8a-4efa-b746-a1827418badd-catalog-content\") pod \"certified-operators-b27jj\" (UID: \"111a5322-ba8a-4efa-b746-a1827418badd\") " pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:13 crc kubenswrapper[4776]: I1204 
10:27:13.974800 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw668\" (UniqueName: \"kubernetes.io/projected/111a5322-ba8a-4efa-b746-a1827418badd-kube-api-access-dw668\") pod \"certified-operators-b27jj\" (UID: \"111a5322-ba8a-4efa-b746-a1827418badd\") " pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:14 crc kubenswrapper[4776]: I1204 10:27:14.098319 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:14 crc kubenswrapper[4776]: I1204 10:27:14.687728 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b27jj"] Dec 04 10:27:14 crc kubenswrapper[4776]: I1204 10:27:14.944664 4776 generic.go:334] "Generic (PLEG): container finished" podID="111a5322-ba8a-4efa-b746-a1827418badd" containerID="fe9a912fd12fb932fe97db3e4d33a298c97c7560365d472aef31c922e76da394" exitCode=0 Dec 04 10:27:14 crc kubenswrapper[4776]: I1204 10:27:14.944761 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b27jj" event={"ID":"111a5322-ba8a-4efa-b746-a1827418badd","Type":"ContainerDied","Data":"fe9a912fd12fb932fe97db3e4d33a298c97c7560365d472aef31c922e76da394"} Dec 04 10:27:14 crc kubenswrapper[4776]: I1204 10:27:14.944788 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b27jj" event={"ID":"111a5322-ba8a-4efa-b746-a1827418badd","Type":"ContainerStarted","Data":"4e1bc2cb4d2d4bbbb689db6940c5940f51be59acb8b19094c29157db94ad80e4"} Dec 04 10:27:14 crc kubenswrapper[4776]: I1204 10:27:14.947617 4776 generic.go:334] "Generic (PLEG): container finished" podID="3cdce026-dfea-4328-ac53-ae69073d630e" containerID="12d38cfb0677ec4fbc249c23f4443916c6574dc9e712599d5661ce24a4ffd753" exitCode=0 Dec 04 10:27:14 crc kubenswrapper[4776]: I1204 10:27:14.947665 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-kj8z2" event={"ID":"3cdce026-dfea-4328-ac53-ae69073d630e","Type":"ContainerDied","Data":"12d38cfb0677ec4fbc249c23f4443916c6574dc9e712599d5661ce24a4ffd753"} Dec 04 10:27:17 crc kubenswrapper[4776]: I1204 10:27:17.979244 4776 generic.go:334] "Generic (PLEG): container finished" podID="111a5322-ba8a-4efa-b746-a1827418badd" containerID="ffe2a476b316ec5fc0e77cd8ab70c3e920377160a1b443a8c4e1ec5b9388129d" exitCode=0 Dec 04 10:27:17 crc kubenswrapper[4776]: I1204 10:27:17.979371 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b27jj" event={"ID":"111a5322-ba8a-4efa-b746-a1827418badd","Type":"ContainerDied","Data":"ffe2a476b316ec5fc0e77cd8ab70c3e920377160a1b443a8c4e1ec5b9388129d"} Dec 04 10:27:19 crc kubenswrapper[4776]: I1204 10:27:19.000703 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b27jj" event={"ID":"111a5322-ba8a-4efa-b746-a1827418badd","Type":"ContainerStarted","Data":"b3fa1cd59236c5a4fa9cf9c4ca24ac85ceaca5a731b39ca33f1aacc79542dc7e"} Dec 04 10:27:19 crc kubenswrapper[4776]: I1204 10:27:19.020996 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b27jj" podStartSLOduration=2.324612297 podStartE2EDuration="6.020977222s" podCreationTimestamp="2025-12-04 10:27:13 +0000 UTC" firstStartedPulling="2025-12-04 10:27:14.946349698 +0000 UTC m=+2879.812830065" lastFinishedPulling="2025-12-04 10:27:18.642714583 +0000 UTC m=+2883.509194990" observedRunningTime="2025-12-04 10:27:19.016141609 +0000 UTC m=+2883.882622006" watchObservedRunningTime="2025-12-04 10:27:19.020977222 +0000 UTC m=+2883.887457599" Dec 04 10:27:19 crc kubenswrapper[4776]: I1204 10:27:19.380122 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:27:19 crc kubenswrapper[4776]: I1204 10:27:19.380200 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:27:21 crc kubenswrapper[4776]: I1204 10:27:21.019088 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kj8z2" event={"ID":"3cdce026-dfea-4328-ac53-ae69073d630e","Type":"ContainerStarted","Data":"91e42854bc10c2b4b9dd7079e5aacab1d18e4ebc5c6f3ef5fe283b529debe854"} Dec 04 10:27:24 crc kubenswrapper[4776]: I1204 10:27:24.047009 4776 generic.go:334] "Generic (PLEG): container finished" podID="3cdce026-dfea-4328-ac53-ae69073d630e" containerID="91e42854bc10c2b4b9dd7079e5aacab1d18e4ebc5c6f3ef5fe283b529debe854" exitCode=0 Dec 04 10:27:24 crc kubenswrapper[4776]: I1204 10:27:24.047090 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kj8z2" event={"ID":"3cdce026-dfea-4328-ac53-ae69073d630e","Type":"ContainerDied","Data":"91e42854bc10c2b4b9dd7079e5aacab1d18e4ebc5c6f3ef5fe283b529debe854"} Dec 04 10:27:24 crc kubenswrapper[4776]: I1204 10:27:24.098529 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:24 crc kubenswrapper[4776]: I1204 10:27:24.098577 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:24 crc kubenswrapper[4776]: I1204 10:27:24.142016 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:25 crc 
kubenswrapper[4776]: I1204 10:27:25.107969 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:26 crc kubenswrapper[4776]: I1204 10:27:26.077408 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kj8z2" event={"ID":"3cdce026-dfea-4328-ac53-ae69073d630e","Type":"ContainerStarted","Data":"09ef0600b782a38d743d0f9404c5de6175ef553e66287c1947269f83a13d7db8"} Dec 04 10:27:26 crc kubenswrapper[4776]: I1204 10:27:26.105367 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kj8z2" podStartSLOduration=3.55688123 podStartE2EDuration="14.105347366s" podCreationTimestamp="2025-12-04 10:27:12 +0000 UTC" firstStartedPulling="2025-12-04 10:27:14.950107697 +0000 UTC m=+2879.816588074" lastFinishedPulling="2025-12-04 10:27:25.498573813 +0000 UTC m=+2890.365054210" observedRunningTime="2025-12-04 10:27:26.101799835 +0000 UTC m=+2890.968280232" watchObservedRunningTime="2025-12-04 10:27:26.105347366 +0000 UTC m=+2890.971827743" Dec 04 10:27:27 crc kubenswrapper[4776]: I1204 10:27:27.542653 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b27jj"] Dec 04 10:27:27 crc kubenswrapper[4776]: I1204 10:27:27.542964 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b27jj" podUID="111a5322-ba8a-4efa-b746-a1827418badd" containerName="registry-server" containerID="cri-o://b3fa1cd59236c5a4fa9cf9c4ca24ac85ceaca5a731b39ca33f1aacc79542dc7e" gracePeriod=2 Dec 04 10:27:28 crc kubenswrapper[4776]: I1204 10:27:28.109622 4776 generic.go:334] "Generic (PLEG): container finished" podID="111a5322-ba8a-4efa-b746-a1827418badd" containerID="b3fa1cd59236c5a4fa9cf9c4ca24ac85ceaca5a731b39ca33f1aacc79542dc7e" exitCode=0 Dec 04 10:27:28 crc kubenswrapper[4776]: I1204 10:27:28.110264 4776 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b27jj" event={"ID":"111a5322-ba8a-4efa-b746-a1827418badd","Type":"ContainerDied","Data":"b3fa1cd59236c5a4fa9cf9c4ca24ac85ceaca5a731b39ca33f1aacc79542dc7e"} Dec 04 10:27:28 crc kubenswrapper[4776]: I1204 10:27:28.181777 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:28 crc kubenswrapper[4776]: I1204 10:27:28.327067 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111a5322-ba8a-4efa-b746-a1827418badd-catalog-content\") pod \"111a5322-ba8a-4efa-b746-a1827418badd\" (UID: \"111a5322-ba8a-4efa-b746-a1827418badd\") " Dec 04 10:27:28 crc kubenswrapper[4776]: I1204 10:27:28.327238 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111a5322-ba8a-4efa-b746-a1827418badd-utilities\") pod \"111a5322-ba8a-4efa-b746-a1827418badd\" (UID: \"111a5322-ba8a-4efa-b746-a1827418badd\") " Dec 04 10:27:28 crc kubenswrapper[4776]: I1204 10:27:28.327410 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw668\" (UniqueName: \"kubernetes.io/projected/111a5322-ba8a-4efa-b746-a1827418badd-kube-api-access-dw668\") pod \"111a5322-ba8a-4efa-b746-a1827418badd\" (UID: \"111a5322-ba8a-4efa-b746-a1827418badd\") " Dec 04 10:27:28 crc kubenswrapper[4776]: I1204 10:27:28.327947 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/111a5322-ba8a-4efa-b746-a1827418badd-utilities" (OuterVolumeSpecName: "utilities") pod "111a5322-ba8a-4efa-b746-a1827418badd" (UID: "111a5322-ba8a-4efa-b746-a1827418badd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:27:28 crc kubenswrapper[4776]: I1204 10:27:28.328134 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111a5322-ba8a-4efa-b746-a1827418badd-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:27:28 crc kubenswrapper[4776]: I1204 10:27:28.332428 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/111a5322-ba8a-4efa-b746-a1827418badd-kube-api-access-dw668" (OuterVolumeSpecName: "kube-api-access-dw668") pod "111a5322-ba8a-4efa-b746-a1827418badd" (UID: "111a5322-ba8a-4efa-b746-a1827418badd"). InnerVolumeSpecName "kube-api-access-dw668". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:27:28 crc kubenswrapper[4776]: I1204 10:27:28.379004 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/111a5322-ba8a-4efa-b746-a1827418badd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "111a5322-ba8a-4efa-b746-a1827418badd" (UID: "111a5322-ba8a-4efa-b746-a1827418badd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:27:28 crc kubenswrapper[4776]: I1204 10:27:28.429616 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw668\" (UniqueName: \"kubernetes.io/projected/111a5322-ba8a-4efa-b746-a1827418badd-kube-api-access-dw668\") on node \"crc\" DevicePath \"\"" Dec 04 10:27:28 crc kubenswrapper[4776]: I1204 10:27:28.429658 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111a5322-ba8a-4efa-b746-a1827418badd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:27:29 crc kubenswrapper[4776]: I1204 10:27:29.120393 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b27jj" event={"ID":"111a5322-ba8a-4efa-b746-a1827418badd","Type":"ContainerDied","Data":"4e1bc2cb4d2d4bbbb689db6940c5940f51be59acb8b19094c29157db94ad80e4"} Dec 04 10:27:29 crc kubenswrapper[4776]: I1204 10:27:29.120582 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b27jj" Dec 04 10:27:29 crc kubenswrapper[4776]: I1204 10:27:29.120779 4776 scope.go:117] "RemoveContainer" containerID="b3fa1cd59236c5a4fa9cf9c4ca24ac85ceaca5a731b39ca33f1aacc79542dc7e" Dec 04 10:27:29 crc kubenswrapper[4776]: I1204 10:27:29.141700 4776 scope.go:117] "RemoveContainer" containerID="ffe2a476b316ec5fc0e77cd8ab70c3e920377160a1b443a8c4e1ec5b9388129d" Dec 04 10:27:29 crc kubenswrapper[4776]: I1204 10:27:29.159967 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b27jj"] Dec 04 10:27:29 crc kubenswrapper[4776]: I1204 10:27:29.169313 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b27jj"] Dec 04 10:27:29 crc kubenswrapper[4776]: I1204 10:27:29.194405 4776 scope.go:117] "RemoveContainer" containerID="fe9a912fd12fb932fe97db3e4d33a298c97c7560365d472aef31c922e76da394" Dec 04 10:27:29 crc kubenswrapper[4776]: I1204 10:27:29.463102 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="111a5322-ba8a-4efa-b746-a1827418badd" path="/var/lib/kubelet/pods/111a5322-ba8a-4efa-b746-a1827418badd/volumes" Dec 04 10:27:33 crc kubenswrapper[4776]: I1204 10:27:33.301052 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:33 crc kubenswrapper[4776]: I1204 10:27:33.301490 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:33 crc kubenswrapper[4776]: I1204 10:27:33.347353 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:34 crc kubenswrapper[4776]: I1204 10:27:34.204674 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:34 crc 
kubenswrapper[4776]: I1204 10:27:34.260384 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kj8z2"] Dec 04 10:27:36 crc kubenswrapper[4776]: I1204 10:27:36.182825 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kj8z2" podUID="3cdce026-dfea-4328-ac53-ae69073d630e" containerName="registry-server" containerID="cri-o://09ef0600b782a38d743d0f9404c5de6175ef553e66287c1947269f83a13d7db8" gracePeriod=2 Dec 04 10:27:36 crc kubenswrapper[4776]: I1204 10:27:36.697582 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:36 crc kubenswrapper[4776]: I1204 10:27:36.801851 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cdce026-dfea-4328-ac53-ae69073d630e-utilities\") pod \"3cdce026-dfea-4328-ac53-ae69073d630e\" (UID: \"3cdce026-dfea-4328-ac53-ae69073d630e\") " Dec 04 10:27:36 crc kubenswrapper[4776]: I1204 10:27:36.802036 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cdce026-dfea-4328-ac53-ae69073d630e-catalog-content\") pod \"3cdce026-dfea-4328-ac53-ae69073d630e\" (UID: \"3cdce026-dfea-4328-ac53-ae69073d630e\") " Dec 04 10:27:36 crc kubenswrapper[4776]: I1204 10:27:36.802107 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b4vp\" (UniqueName: \"kubernetes.io/projected/3cdce026-dfea-4328-ac53-ae69073d630e-kube-api-access-7b4vp\") pod \"3cdce026-dfea-4328-ac53-ae69073d630e\" (UID: \"3cdce026-dfea-4328-ac53-ae69073d630e\") " Dec 04 10:27:36 crc kubenswrapper[4776]: I1204 10:27:36.803796 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cdce026-dfea-4328-ac53-ae69073d630e-utilities" 
(OuterVolumeSpecName: "utilities") pod "3cdce026-dfea-4328-ac53-ae69073d630e" (UID: "3cdce026-dfea-4328-ac53-ae69073d630e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:27:36 crc kubenswrapper[4776]: I1204 10:27:36.808737 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cdce026-dfea-4328-ac53-ae69073d630e-kube-api-access-7b4vp" (OuterVolumeSpecName: "kube-api-access-7b4vp") pod "3cdce026-dfea-4328-ac53-ae69073d630e" (UID: "3cdce026-dfea-4328-ac53-ae69073d630e"). InnerVolumeSpecName "kube-api-access-7b4vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:27:36 crc kubenswrapper[4776]: I1204 10:27:36.904293 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cdce026-dfea-4328-ac53-ae69073d630e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:27:36 crc kubenswrapper[4776]: I1204 10:27:36.904338 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b4vp\" (UniqueName: \"kubernetes.io/projected/3cdce026-dfea-4328-ac53-ae69073d630e-kube-api-access-7b4vp\") on node \"crc\" DevicePath \"\"" Dec 04 10:27:36 crc kubenswrapper[4776]: I1204 10:27:36.908223 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cdce026-dfea-4328-ac53-ae69073d630e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cdce026-dfea-4328-ac53-ae69073d630e" (UID: "3cdce026-dfea-4328-ac53-ae69073d630e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.005912 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cdce026-dfea-4328-ac53-ae69073d630e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.191894 4776 generic.go:334] "Generic (PLEG): container finished" podID="3cdce026-dfea-4328-ac53-ae69073d630e" containerID="09ef0600b782a38d743d0f9404c5de6175ef553e66287c1947269f83a13d7db8" exitCode=0 Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.191949 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kj8z2" event={"ID":"3cdce026-dfea-4328-ac53-ae69073d630e","Type":"ContainerDied","Data":"09ef0600b782a38d743d0f9404c5de6175ef553e66287c1947269f83a13d7db8"} Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.192019 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kj8z2" event={"ID":"3cdce026-dfea-4328-ac53-ae69073d630e","Type":"ContainerDied","Data":"a9210f89e8bc002ac745a81b81ccb51d776b04a213e7364df4a6466177d8c1a5"} Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.192040 4776 scope.go:117] "RemoveContainer" containerID="09ef0600b782a38d743d0f9404c5de6175ef553e66287c1947269f83a13d7db8" Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.192043 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kj8z2" Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.213114 4776 scope.go:117] "RemoveContainer" containerID="91e42854bc10c2b4b9dd7079e5aacab1d18e4ebc5c6f3ef5fe283b529debe854" Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.240984 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kj8z2"] Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.246937 4776 scope.go:117] "RemoveContainer" containerID="12d38cfb0677ec4fbc249c23f4443916c6574dc9e712599d5661ce24a4ffd753" Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.251172 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kj8z2"] Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.284803 4776 scope.go:117] "RemoveContainer" containerID="09ef0600b782a38d743d0f9404c5de6175ef553e66287c1947269f83a13d7db8" Dec 04 10:27:37 crc kubenswrapper[4776]: E1204 10:27:37.286792 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ef0600b782a38d743d0f9404c5de6175ef553e66287c1947269f83a13d7db8\": container with ID starting with 09ef0600b782a38d743d0f9404c5de6175ef553e66287c1947269f83a13d7db8 not found: ID does not exist" containerID="09ef0600b782a38d743d0f9404c5de6175ef553e66287c1947269f83a13d7db8" Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.286837 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ef0600b782a38d743d0f9404c5de6175ef553e66287c1947269f83a13d7db8"} err="failed to get container status \"09ef0600b782a38d743d0f9404c5de6175ef553e66287c1947269f83a13d7db8\": rpc error: code = NotFound desc = could not find container \"09ef0600b782a38d743d0f9404c5de6175ef553e66287c1947269f83a13d7db8\": container with ID starting with 09ef0600b782a38d743d0f9404c5de6175ef553e66287c1947269f83a13d7db8 not found: ID does 
not exist" Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.286863 4776 scope.go:117] "RemoveContainer" containerID="91e42854bc10c2b4b9dd7079e5aacab1d18e4ebc5c6f3ef5fe283b529debe854" Dec 04 10:27:37 crc kubenswrapper[4776]: E1204 10:27:37.287384 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e42854bc10c2b4b9dd7079e5aacab1d18e4ebc5c6f3ef5fe283b529debe854\": container with ID starting with 91e42854bc10c2b4b9dd7079e5aacab1d18e4ebc5c6f3ef5fe283b529debe854 not found: ID does not exist" containerID="91e42854bc10c2b4b9dd7079e5aacab1d18e4ebc5c6f3ef5fe283b529debe854" Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.287425 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e42854bc10c2b4b9dd7079e5aacab1d18e4ebc5c6f3ef5fe283b529debe854"} err="failed to get container status \"91e42854bc10c2b4b9dd7079e5aacab1d18e4ebc5c6f3ef5fe283b529debe854\": rpc error: code = NotFound desc = could not find container \"91e42854bc10c2b4b9dd7079e5aacab1d18e4ebc5c6f3ef5fe283b529debe854\": container with ID starting with 91e42854bc10c2b4b9dd7079e5aacab1d18e4ebc5c6f3ef5fe283b529debe854 not found: ID does not exist" Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.287453 4776 scope.go:117] "RemoveContainer" containerID="12d38cfb0677ec4fbc249c23f4443916c6574dc9e712599d5661ce24a4ffd753" Dec 04 10:27:37 crc kubenswrapper[4776]: E1204 10:27:37.287935 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12d38cfb0677ec4fbc249c23f4443916c6574dc9e712599d5661ce24a4ffd753\": container with ID starting with 12d38cfb0677ec4fbc249c23f4443916c6574dc9e712599d5661ce24a4ffd753 not found: ID does not exist" containerID="12d38cfb0677ec4fbc249c23f4443916c6574dc9e712599d5661ce24a4ffd753" Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.287962 4776 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d38cfb0677ec4fbc249c23f4443916c6574dc9e712599d5661ce24a4ffd753"} err="failed to get container status \"12d38cfb0677ec4fbc249c23f4443916c6574dc9e712599d5661ce24a4ffd753\": rpc error: code = NotFound desc = could not find container \"12d38cfb0677ec4fbc249c23f4443916c6574dc9e712599d5661ce24a4ffd753\": container with ID starting with 12d38cfb0677ec4fbc249c23f4443916c6574dc9e712599d5661ce24a4ffd753 not found: ID does not exist" Dec 04 10:27:37 crc kubenswrapper[4776]: I1204 10:27:37.464210 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cdce026-dfea-4328-ac53-ae69073d630e" path="/var/lib/kubelet/pods/3cdce026-dfea-4328-ac53-ae69073d630e/volumes" Dec 04 10:27:49 crc kubenswrapper[4776]: I1204 10:27:49.379714 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:27:49 crc kubenswrapper[4776]: I1204 10:27:49.380277 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:27:49 crc kubenswrapper[4776]: I1204 10:27:49.380327 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 10:27:49 crc kubenswrapper[4776]: I1204 10:27:49.381068 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0"} 
pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:27:49 crc kubenswrapper[4776]: I1204 10:27:49.381125 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" gracePeriod=600 Dec 04 10:27:49 crc kubenswrapper[4776]: E1204 10:27:49.520564 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:27:50 crc kubenswrapper[4776]: I1204 10:27:50.300290 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" exitCode=0 Dec 04 10:27:50 crc kubenswrapper[4776]: I1204 10:27:50.300414 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0"} Dec 04 10:27:50 crc kubenswrapper[4776]: I1204 10:27:50.300758 4776 scope.go:117] "RemoveContainer" containerID="3b3694b7bd99eb3bb73428b2f0d0a80b952a000625fa6e6dc456d1e286636470" Dec 04 10:27:50 crc kubenswrapper[4776]: I1204 10:27:50.302008 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 
04 10:27:50 crc kubenswrapper[4776]: E1204 10:27:50.302424 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:28:04 crc kubenswrapper[4776]: I1204 10:28:04.453246 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:28:04 crc kubenswrapper[4776]: E1204 10:28:04.453932 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:28:16 crc kubenswrapper[4776]: I1204 10:28:16.452987 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:28:16 crc kubenswrapper[4776]: E1204 10:28:16.454374 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:28:28 crc kubenswrapper[4776]: I1204 10:28:28.453434 4776 scope.go:117] "RemoveContainer" 
containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:28:28 crc kubenswrapper[4776]: E1204 10:28:28.454620 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:28:42 crc kubenswrapper[4776]: I1204 10:28:42.452574 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:28:42 crc kubenswrapper[4776]: E1204 10:28:42.453429 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:28:56 crc kubenswrapper[4776]: I1204 10:28:56.453236 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:28:56 crc kubenswrapper[4776]: E1204 10:28:56.454301 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:29:11 crc kubenswrapper[4776]: I1204 10:29:11.452138 4776 scope.go:117] 
"RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:29:11 crc kubenswrapper[4776]: E1204 10:29:11.453004 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:29:25 crc kubenswrapper[4776]: I1204 10:29:25.467990 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:29:25 crc kubenswrapper[4776]: E1204 10:29:25.469265 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:29:37 crc kubenswrapper[4776]: I1204 10:29:37.452312 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:29:37 crc kubenswrapper[4776]: E1204 10:29:37.453210 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:29:52 crc kubenswrapper[4776]: I1204 10:29:52.453143 
4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:29:52 crc kubenswrapper[4776]: E1204 10:29:52.454156 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.156490 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr"] Dec 04 10:30:00 crc kubenswrapper[4776]: E1204 10:30:00.157508 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111a5322-ba8a-4efa-b746-a1827418badd" containerName="registry-server" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.157528 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="111a5322-ba8a-4efa-b746-a1827418badd" containerName="registry-server" Dec 04 10:30:00 crc kubenswrapper[4776]: E1204 10:30:00.157547 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cdce026-dfea-4328-ac53-ae69073d630e" containerName="extract-content" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.157553 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cdce026-dfea-4328-ac53-ae69073d630e" containerName="extract-content" Dec 04 10:30:00 crc kubenswrapper[4776]: E1204 10:30:00.157563 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cdce026-dfea-4328-ac53-ae69073d630e" containerName="extract-utilities" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.157570 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cdce026-dfea-4328-ac53-ae69073d630e" containerName="extract-utilities" Dec 
04 10:30:00 crc kubenswrapper[4776]: E1204 10:30:00.157599 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cdce026-dfea-4328-ac53-ae69073d630e" containerName="registry-server" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.157606 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cdce026-dfea-4328-ac53-ae69073d630e" containerName="registry-server" Dec 04 10:30:00 crc kubenswrapper[4776]: E1204 10:30:00.157621 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111a5322-ba8a-4efa-b746-a1827418badd" containerName="extract-content" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.157628 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="111a5322-ba8a-4efa-b746-a1827418badd" containerName="extract-content" Dec 04 10:30:00 crc kubenswrapper[4776]: E1204 10:30:00.157656 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111a5322-ba8a-4efa-b746-a1827418badd" containerName="extract-utilities" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.157662 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="111a5322-ba8a-4efa-b746-a1827418badd" containerName="extract-utilities" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.157842 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="111a5322-ba8a-4efa-b746-a1827418badd" containerName="registry-server" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.157864 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cdce026-dfea-4328-ac53-ae69073d630e" containerName="registry-server" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.158577 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.160963 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.161065 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.165696 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr"] Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.198960 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58scj\" (UniqueName: \"kubernetes.io/projected/067b6bca-0de1-48a5-8627-e8b8543e7860-kube-api-access-58scj\") pod \"collect-profiles-29414070-wcsvr\" (UID: \"067b6bca-0de1-48a5-8627-e8b8543e7860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.199108 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/067b6bca-0de1-48a5-8627-e8b8543e7860-config-volume\") pod \"collect-profiles-29414070-wcsvr\" (UID: \"067b6bca-0de1-48a5-8627-e8b8543e7860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.199194 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/067b6bca-0de1-48a5-8627-e8b8543e7860-secret-volume\") pod \"collect-profiles-29414070-wcsvr\" (UID: \"067b6bca-0de1-48a5-8627-e8b8543e7860\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.300778 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58scj\" (UniqueName: \"kubernetes.io/projected/067b6bca-0de1-48a5-8627-e8b8543e7860-kube-api-access-58scj\") pod \"collect-profiles-29414070-wcsvr\" (UID: \"067b6bca-0de1-48a5-8627-e8b8543e7860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.300868 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/067b6bca-0de1-48a5-8627-e8b8543e7860-config-volume\") pod \"collect-profiles-29414070-wcsvr\" (UID: \"067b6bca-0de1-48a5-8627-e8b8543e7860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.300982 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/067b6bca-0de1-48a5-8627-e8b8543e7860-secret-volume\") pod \"collect-profiles-29414070-wcsvr\" (UID: \"067b6bca-0de1-48a5-8627-e8b8543e7860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.302162 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/067b6bca-0de1-48a5-8627-e8b8543e7860-config-volume\") pod \"collect-profiles-29414070-wcsvr\" (UID: \"067b6bca-0de1-48a5-8627-e8b8543e7860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.310597 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/067b6bca-0de1-48a5-8627-e8b8543e7860-secret-volume\") pod \"collect-profiles-29414070-wcsvr\" (UID: \"067b6bca-0de1-48a5-8627-e8b8543e7860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.319238 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58scj\" (UniqueName: \"kubernetes.io/projected/067b6bca-0de1-48a5-8627-e8b8543e7860-kube-api-access-58scj\") pod \"collect-profiles-29414070-wcsvr\" (UID: \"067b6bca-0de1-48a5-8627-e8b8543e7860\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" Dec 04 10:30:00 crc kubenswrapper[4776]: I1204 10:30:00.481866 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" Dec 04 10:30:01 crc kubenswrapper[4776]: I1204 10:30:01.119850 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr"] Dec 04 10:30:01 crc kubenswrapper[4776]: I1204 10:30:01.854237 4776 generic.go:334] "Generic (PLEG): container finished" podID="067b6bca-0de1-48a5-8627-e8b8543e7860" containerID="bd55a73ff2944741393a0f1b0aafa6f652ead887a5b9644b9028c210b3f0ed49" exitCode=0 Dec 04 10:30:01 crc kubenswrapper[4776]: I1204 10:30:01.854286 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" event={"ID":"067b6bca-0de1-48a5-8627-e8b8543e7860","Type":"ContainerDied","Data":"bd55a73ff2944741393a0f1b0aafa6f652ead887a5b9644b9028c210b3f0ed49"} Dec 04 10:30:01 crc kubenswrapper[4776]: I1204 10:30:01.854321 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" 
event={"ID":"067b6bca-0de1-48a5-8627-e8b8543e7860","Type":"ContainerStarted","Data":"22a74f3b0cedb637a2df528699ebd6381e38ccd905d61039f8fe472d4d28a39d"} Dec 04 10:30:03 crc kubenswrapper[4776]: I1204 10:30:03.209331 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" Dec 04 10:30:03 crc kubenswrapper[4776]: I1204 10:30:03.374606 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58scj\" (UniqueName: \"kubernetes.io/projected/067b6bca-0de1-48a5-8627-e8b8543e7860-kube-api-access-58scj\") pod \"067b6bca-0de1-48a5-8627-e8b8543e7860\" (UID: \"067b6bca-0de1-48a5-8627-e8b8543e7860\") " Dec 04 10:30:03 crc kubenswrapper[4776]: I1204 10:30:03.374690 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/067b6bca-0de1-48a5-8627-e8b8543e7860-secret-volume\") pod \"067b6bca-0de1-48a5-8627-e8b8543e7860\" (UID: \"067b6bca-0de1-48a5-8627-e8b8543e7860\") " Dec 04 10:30:03 crc kubenswrapper[4776]: I1204 10:30:03.374731 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/067b6bca-0de1-48a5-8627-e8b8543e7860-config-volume\") pod \"067b6bca-0de1-48a5-8627-e8b8543e7860\" (UID: \"067b6bca-0de1-48a5-8627-e8b8543e7860\") " Dec 04 10:30:03 crc kubenswrapper[4776]: I1204 10:30:03.375642 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067b6bca-0de1-48a5-8627-e8b8543e7860-config-volume" (OuterVolumeSpecName: "config-volume") pod "067b6bca-0de1-48a5-8627-e8b8543e7860" (UID: "067b6bca-0de1-48a5-8627-e8b8543e7860"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:30:03 crc kubenswrapper[4776]: I1204 10:30:03.381738 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067b6bca-0de1-48a5-8627-e8b8543e7860-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "067b6bca-0de1-48a5-8627-e8b8543e7860" (UID: "067b6bca-0de1-48a5-8627-e8b8543e7860"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:03 crc kubenswrapper[4776]: I1204 10:30:03.382084 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/067b6bca-0de1-48a5-8627-e8b8543e7860-kube-api-access-58scj" (OuterVolumeSpecName: "kube-api-access-58scj") pod "067b6bca-0de1-48a5-8627-e8b8543e7860" (UID: "067b6bca-0de1-48a5-8627-e8b8543e7860"). InnerVolumeSpecName "kube-api-access-58scj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:30:03 crc kubenswrapper[4776]: I1204 10:30:03.477270 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/067b6bca-0de1-48a5-8627-e8b8543e7860-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:03 crc kubenswrapper[4776]: I1204 10:30:03.477319 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/067b6bca-0de1-48a5-8627-e8b8543e7860-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:03 crc kubenswrapper[4776]: I1204 10:30:03.477332 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58scj\" (UniqueName: \"kubernetes.io/projected/067b6bca-0de1-48a5-8627-e8b8543e7860-kube-api-access-58scj\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:03 crc kubenswrapper[4776]: I1204 10:30:03.883801 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" Dec 04 10:30:03 crc kubenswrapper[4776]: I1204 10:30:03.883847 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414070-wcsvr" event={"ID":"067b6bca-0de1-48a5-8627-e8b8543e7860","Type":"ContainerDied","Data":"22a74f3b0cedb637a2df528699ebd6381e38ccd905d61039f8fe472d4d28a39d"} Dec 04 10:30:03 crc kubenswrapper[4776]: I1204 10:30:03.883933 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22a74f3b0cedb637a2df528699ebd6381e38ccd905d61039f8fe472d4d28a39d" Dec 04 10:30:04 crc kubenswrapper[4776]: I1204 10:30:04.292266 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt"] Dec 04 10:30:04 crc kubenswrapper[4776]: I1204 10:30:04.299596 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414025-jmfgt"] Dec 04 10:30:05 crc kubenswrapper[4776]: I1204 10:30:05.461549 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:30:05 crc kubenswrapper[4776]: E1204 10:30:05.462157 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:30:05 crc kubenswrapper[4776]: I1204 10:30:05.468838 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cf991e7-ff27-441a-b83f-a70a66455185" path="/var/lib/kubelet/pods/3cf991e7-ff27-441a-b83f-a70a66455185/volumes" Dec 04 10:30:19 crc 
kubenswrapper[4776]: I1204 10:30:19.452634 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:30:19 crc kubenswrapper[4776]: E1204 10:30:19.453426 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:30:22 crc kubenswrapper[4776]: I1204 10:30:22.864587 4776 scope.go:117] "RemoveContainer" containerID="376f5b5e11af2c4defbeec654c99370acc674280766a7cdd9957440bad5b85c2" Dec 04 10:30:30 crc kubenswrapper[4776]: I1204 10:30:30.118592 4776 generic.go:334] "Generic (PLEG): container finished" podID="c95fc34d-f4d9-45d9-acf3-a4fb114a972e" containerID="981eec68edd56d698414e8b1726d95e1632167f0aafba3f23a48bde0afa95a6e" exitCode=0 Dec 04 10:30:30 crc kubenswrapper[4776]: I1204 10:30:30.118698 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" event={"ID":"c95fc34d-f4d9-45d9-acf3-a4fb114a972e","Type":"ContainerDied","Data":"981eec68edd56d698414e8b1726d95e1632167f0aafba3f23a48bde0afa95a6e"} Dec 04 10:30:30 crc kubenswrapper[4776]: I1204 10:30:30.453466 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:30:30 crc kubenswrapper[4776]: E1204 10:30:30.453699 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.555948 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.669878 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-libvirt-combined-ca-bundle\") pod \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.670016 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj5s4\" (UniqueName: \"kubernetes.io/projected/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-kube-api-access-bj5s4\") pod \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.670131 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-ceph\") pod \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.670171 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-ssh-key\") pod \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.670239 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-inventory\") pod 
\"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.670280 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-libvirt-secret-0\") pod \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\" (UID: \"c95fc34d-f4d9-45d9-acf3-a4fb114a972e\") " Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.676484 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-kube-api-access-bj5s4" (OuterVolumeSpecName: "kube-api-access-bj5s4") pod "c95fc34d-f4d9-45d9-acf3-a4fb114a972e" (UID: "c95fc34d-f4d9-45d9-acf3-a4fb114a972e"). InnerVolumeSpecName "kube-api-access-bj5s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.677003 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c95fc34d-f4d9-45d9-acf3-a4fb114a972e" (UID: "c95fc34d-f4d9-45d9-acf3-a4fb114a972e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.681058 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-ceph" (OuterVolumeSpecName: "ceph") pod "c95fc34d-f4d9-45d9-acf3-a4fb114a972e" (UID: "c95fc34d-f4d9-45d9-acf3-a4fb114a972e"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.698269 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-inventory" (OuterVolumeSpecName: "inventory") pod "c95fc34d-f4d9-45d9-acf3-a4fb114a972e" (UID: "c95fc34d-f4d9-45d9-acf3-a4fb114a972e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.702044 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "c95fc34d-f4d9-45d9-acf3-a4fb114a972e" (UID: "c95fc34d-f4d9-45d9-acf3-a4fb114a972e"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.703616 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c95fc34d-f4d9-45d9-acf3-a4fb114a972e" (UID: "c95fc34d-f4d9-45d9-acf3-a4fb114a972e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.773314 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.773883 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.773904 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.773939 4776 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.773956 4776 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:31 crc kubenswrapper[4776]: I1204 10:30:31.773974 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj5s4\" (UniqueName: \"kubernetes.io/projected/c95fc34d-f4d9-45d9-acf3-a4fb114a972e-kube-api-access-bj5s4\") on node \"crc\" DevicePath \"\"" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.146347 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" event={"ID":"c95fc34d-f4d9-45d9-acf3-a4fb114a972e","Type":"ContainerDied","Data":"21f515e87c74c15096ee1c919ddca8e60d046f193a1d8ecc3be1b23a0cdea3f7"} Dec 04 10:30:32 crc 
kubenswrapper[4776]: I1204 10:30:32.146392 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21f515e87c74c15096ee1c919ddca8e60d046f193a1d8ecc3be1b23a0cdea3f7" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.146449 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.245772 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9"] Dec 04 10:30:32 crc kubenswrapper[4776]: E1204 10:30:32.246128 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c95fc34d-f4d9-45d9-acf3-a4fb114a972e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.246146 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c95fc34d-f4d9-45d9-acf3-a4fb114a972e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 10:30:32 crc kubenswrapper[4776]: E1204 10:30:32.246162 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067b6bca-0de1-48a5-8627-e8b8543e7860" containerName="collect-profiles" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.246173 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="067b6bca-0de1-48a5-8627-e8b8543e7860" containerName="collect-profiles" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.246368 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="067b6bca-0de1-48a5-8627-e8b8543e7860" containerName="collect-profiles" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.246388 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c95fc34d-f4d9-45d9-acf3-a4fb114a972e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.247798 4776 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.250904 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.251029 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6sjk6" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.251035 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.251073 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.251098 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.251821 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.252851 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.253239 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.253355 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.255764 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9"] Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.385045 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.385099 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.385134 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.385178 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.385198 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/700d0cc0-f03a-47f4-bb74-d727bda5f904-ceph-nova-0\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.385230 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.385330 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.385350 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.385373 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-custom-ceph-combined-ca-bundle\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.385527 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.385732 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h76pb\" (UniqueName: \"kubernetes.io/projected/700d0cc0-f03a-47f4-bb74-d727bda5f904-kube-api-access-h76pb\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.515332 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.515417 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: 
\"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.515508 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.515610 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.515705 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h76pb\" (UniqueName: \"kubernetes.io/projected/700d0cc0-f03a-47f4-bb74-d727bda5f904-kube-api-access-h76pb\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.515784 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: 
I1204 10:30:32.515826 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.515877 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.515985 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.516021 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/700d0cc0-f03a-47f4-bb74-d727bda5f904-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.516078 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.516908 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.517222 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/700d0cc0-f03a-47f4-bb74-d727bda5f904-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.520192 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.520878 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.521363 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.522379 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.527907 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.528220 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.528995 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.532291 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.533826 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h76pb\" (UniqueName: \"kubernetes.io/projected/700d0cc0-f03a-47f4-bb74-d727bda5f904-kube-api-access-h76pb\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:32 crc kubenswrapper[4776]: I1204 10:30:32.571276 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:30:33 crc kubenswrapper[4776]: I1204 10:30:33.093575 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9"] Dec 04 10:30:33 crc kubenswrapper[4776]: I1204 10:30:33.155686 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" event={"ID":"700d0cc0-f03a-47f4-bb74-d727bda5f904","Type":"ContainerStarted","Data":"c6b073205946ee0b72afe6c9946e61107328ffed9a7fc85395b1d4395f3d9600"} Dec 04 10:30:34 crc kubenswrapper[4776]: I1204 10:30:34.165235 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" event={"ID":"700d0cc0-f03a-47f4-bb74-d727bda5f904","Type":"ContainerStarted","Data":"1e3d1a747413417f2c45fe01ec91a9bcca9f7063947562e05247baab47325e73"} Dec 04 10:30:34 crc kubenswrapper[4776]: I1204 10:30:34.187233 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" podStartSLOduration=1.740457954 podStartE2EDuration="2.18721611s" podCreationTimestamp="2025-12-04 10:30:32 +0000 UTC" firstStartedPulling="2025-12-04 10:30:33.098956741 +0000 UTC m=+3077.965437118" lastFinishedPulling="2025-12-04 10:30:33.545714897 +0000 UTC m=+3078.412195274" observedRunningTime="2025-12-04 10:30:34.183227615 +0000 UTC m=+3079.049708012" watchObservedRunningTime="2025-12-04 10:30:34.18721611 +0000 UTC m=+3079.053696487" Dec 04 10:30:44 crc kubenswrapper[4776]: I1204 10:30:44.452400 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:30:44 crc kubenswrapper[4776]: E1204 10:30:44.453224 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:30:58 crc kubenswrapper[4776]: I1204 10:30:58.452626 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:30:58 crc kubenswrapper[4776]: E1204 10:30:58.453588 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:31:13 crc kubenswrapper[4776]: I1204 10:31:13.452881 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:31:13 crc kubenswrapper[4776]: E1204 10:31:13.453677 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:31:28 crc kubenswrapper[4776]: I1204 10:31:28.452318 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:31:28 crc kubenswrapper[4776]: E1204 10:31:28.453155 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:31:41 crc kubenswrapper[4776]: I1204 10:31:41.452227 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:31:41 crc kubenswrapper[4776]: E1204 10:31:41.453252 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:31:52 crc kubenswrapper[4776]: I1204 10:31:52.452317 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:31:52 crc kubenswrapper[4776]: E1204 10:31:52.453078 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:32:03 crc kubenswrapper[4776]: I1204 10:32:03.452706 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:32:03 crc kubenswrapper[4776]: E1204 10:32:03.453736 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:32:15 crc kubenswrapper[4776]: I1204 10:32:15.465877 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:32:15 crc kubenswrapper[4776]: E1204 10:32:15.467172 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:32:28 crc kubenswrapper[4776]: I1204 10:32:28.452876 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:32:28 crc kubenswrapper[4776]: E1204 10:32:28.453796 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:32:41 crc kubenswrapper[4776]: I1204 10:32:41.452326 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:32:41 crc kubenswrapper[4776]: E1204 10:32:41.453635 4776 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:32:56 crc kubenswrapper[4776]: I1204 10:32:56.452832 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0" Dec 04 10:32:57 crc kubenswrapper[4776]: I1204 10:32:57.223604 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"28b06bab60fad7f595377c61d0d33cf6466b8ad6aa3300c1a8a4c45dfe1ba590"} Dec 04 10:33:35 crc kubenswrapper[4776]: I1204 10:33:35.561607 4776 generic.go:334] "Generic (PLEG): container finished" podID="700d0cc0-f03a-47f4-bb74-d727bda5f904" containerID="1e3d1a747413417f2c45fe01ec91a9bcca9f7063947562e05247baab47325e73" exitCode=0 Dec 04 10:33:35 crc kubenswrapper[4776]: I1204 10:33:35.561634 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" event={"ID":"700d0cc0-f03a-47f4-bb74-d727bda5f904","Type":"ContainerDied","Data":"1e3d1a747413417f2c45fe01ec91a9bcca9f7063947562e05247baab47325e73"} Dec 04 10:33:36 crc kubenswrapper[4776]: I1204 10:33:36.983411 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:33:36 crc kubenswrapper[4776]: I1204 10:33:36.989769 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-ceph\") pod \"700d0cc0-f03a-47f4-bb74-d727bda5f904\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " Dec 04 10:33:36 crc kubenswrapper[4776]: I1204 10:33:36.989811 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h76pb\" (UniqueName: \"kubernetes.io/projected/700d0cc0-f03a-47f4-bb74-d727bda5f904-kube-api-access-h76pb\") pod \"700d0cc0-f03a-47f4-bb74-d727bda5f904\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " Dec 04 10:33:36 crc kubenswrapper[4776]: I1204 10:33:36.989847 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-custom-ceph-combined-ca-bundle\") pod \"700d0cc0-f03a-47f4-bb74-d727bda5f904\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " Dec 04 10:33:36 crc kubenswrapper[4776]: I1204 10:33:36.989882 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-migration-ssh-key-1\") pod \"700d0cc0-f03a-47f4-bb74-d727bda5f904\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " Dec 04 10:33:36 crc kubenswrapper[4776]: I1204 10:33:36.989957 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-inventory\") pod \"700d0cc0-f03a-47f4-bb74-d727bda5f904\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " Dec 04 10:33:36 crc kubenswrapper[4776]: I1204 10:33:36.989987 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-migration-ssh-key-0\") pod \"700d0cc0-f03a-47f4-bb74-d727bda5f904\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " Dec 04 10:33:36 crc kubenswrapper[4776]: I1204 10:33:36.990010 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/700d0cc0-f03a-47f4-bb74-d727bda5f904-ceph-nova-0\") pod \"700d0cc0-f03a-47f4-bb74-d727bda5f904\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " Dec 04 10:33:36 crc kubenswrapper[4776]: I1204 10:33:36.990033 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-cell1-compute-config-1\") pod \"700d0cc0-f03a-47f4-bb74-d727bda5f904\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " Dec 04 10:33:36 crc kubenswrapper[4776]: I1204 10:33:36.990091 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-ssh-key\") pod \"700d0cc0-f03a-47f4-bb74-d727bda5f904\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " Dec 04 10:33:36 crc kubenswrapper[4776]: I1204 10:33:36.990115 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-cell1-compute-config-0\") pod \"700d0cc0-f03a-47f4-bb74-d727bda5f904\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " Dec 04 10:33:36 crc kubenswrapper[4776]: I1204 10:33:36.990146 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-extra-config-0\") pod 
\"700d0cc0-f03a-47f4-bb74-d727bda5f904\" (UID: \"700d0cc0-f03a-47f4-bb74-d727bda5f904\") " Dec 04 10:33:36 crc kubenswrapper[4776]: I1204 10:33:36.996894 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700d0cc0-f03a-47f4-bb74-d727bda5f904-kube-api-access-h76pb" (OuterVolumeSpecName: "kube-api-access-h76pb") pod "700d0cc0-f03a-47f4-bb74-d727bda5f904" (UID: "700d0cc0-f03a-47f4-bb74-d727bda5f904"). InnerVolumeSpecName "kube-api-access-h76pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.061161 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-ceph" (OuterVolumeSpecName: "ceph") pod "700d0cc0-f03a-47f4-bb74-d727bda5f904" (UID: "700d0cc0-f03a-47f4-bb74-d727bda5f904"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.061212 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "700d0cc0-f03a-47f4-bb74-d727bda5f904" (UID: "700d0cc0-f03a-47f4-bb74-d727bda5f904"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.072899 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "700d0cc0-f03a-47f4-bb74-d727bda5f904" (UID: "700d0cc0-f03a-47f4-bb74-d727bda5f904"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.073142 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/700d0cc0-f03a-47f4-bb74-d727bda5f904-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "700d0cc0-f03a-47f4-bb74-d727bda5f904" (UID: "700d0cc0-f03a-47f4-bb74-d727bda5f904"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.073745 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "700d0cc0-f03a-47f4-bb74-d727bda5f904" (UID: "700d0cc0-f03a-47f4-bb74-d727bda5f904"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.074211 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "700d0cc0-f03a-47f4-bb74-d727bda5f904" (UID: "700d0cc0-f03a-47f4-bb74-d727bda5f904"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.075140 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-inventory" (OuterVolumeSpecName: "inventory") pod "700d0cc0-f03a-47f4-bb74-d727bda5f904" (UID: "700d0cc0-f03a-47f4-bb74-d727bda5f904"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.075166 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "700d0cc0-f03a-47f4-bb74-d727bda5f904" (UID: "700d0cc0-f03a-47f4-bb74-d727bda5f904"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.086294 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "700d0cc0-f03a-47f4-bb74-d727bda5f904" (UID: "700d0cc0-f03a-47f4-bb74-d727bda5f904"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.091136 4776 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.091535 4776 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.091727 4776 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.091801 4776 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" 
(UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.091864 4776 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/700d0cc0-f03a-47f4-bb74-d727bda5f904-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.091935 4776 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.092011 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.092079 4776 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.092140 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/700d0cc0-f03a-47f4-bb74-d727bda5f904-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.092195 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h76pb\" (UniqueName: \"kubernetes.io/projected/700d0cc0-f03a-47f4-bb74-d727bda5f904-kube-api-access-h76pb\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.100552 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-extra-config-0" 
(OuterVolumeSpecName: "nova-extra-config-0") pod "700d0cc0-f03a-47f4-bb74-d727bda5f904" (UID: "700d0cc0-f03a-47f4-bb74-d727bda5f904"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.194439 4776 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/700d0cc0-f03a-47f4-bb74-d727bda5f904-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.579151 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" event={"ID":"700d0cc0-f03a-47f4-bb74-d727bda5f904","Type":"ContainerDied","Data":"c6b073205946ee0b72afe6c9946e61107328ffed9a7fc85395b1d4395f3d9600"} Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.579193 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6b073205946ee0b72afe6c9946e61107328ffed9a7fc85395b1d4395f3d9600" Dec 04 10:33:37 crc kubenswrapper[4776]: I1204 10:33:37.579195 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.329095 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 04 10:33:53 crc kubenswrapper[4776]: E1204 10:33:53.330077 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700d0cc0-f03a-47f4-bb74-d727bda5f904" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.330093 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="700d0cc0-f03a-47f4-bb74-d727bda5f904" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.330296 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="700d0cc0-f03a-47f4-bb74-d727bda5f904" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.331291 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.336121 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.340082 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.346679 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.349016 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.355303 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.365767 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.390806 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501053 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-lib-modules\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501112 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-dev\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501140 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8933a5ae-42ff-44b3-bd28-38a424729b83-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501161 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55871647-5a7a-4fbf-954e-67418476628e-scripts\") pod \"cinder-backup-0\" (UID: 
\"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501184 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501207 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501233 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-dev\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501255 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55871647-5a7a-4fbf-954e-67418476628e-config-data-custom\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501282 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " 
pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501299 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-run\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501328 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501354 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55871647-5a7a-4fbf-954e-67418476628e-config-data\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501370 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55871647-5a7a-4fbf-954e-67418476628e-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501387 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8933a5ae-42ff-44b3-bd28-38a424729b83-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc 
kubenswrapper[4776]: I1204 10:33:53.501405 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501432 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8933a5ae-42ff-44b3-bd28-38a424729b83-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501470 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501494 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/55871647-5a7a-4fbf-954e-67418476628e-ceph\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501520 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8933a5ae-42ff-44b3-bd28-38a424729b83-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501543 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501596 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501625 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-etc-nvme\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501669 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8933a5ae-42ff-44b3-bd28-38a424729b83-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501694 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-sys\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501721 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501742 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501761 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-sys\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501788 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501815 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501835 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-run\") pod 
\"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501857 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h66zr\" (UniqueName: \"kubernetes.io/projected/55871647-5a7a-4fbf-954e-67418476628e-kube-api-access-h66zr\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.501880 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xcp8\" (UniqueName: \"kubernetes.io/projected/8933a5ae-42ff-44b3-bd28-38a424729b83-kube-api-access-2xcp8\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.604050 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.604111 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55871647-5a7a-4fbf-954e-67418476628e-config-data\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.604142 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55871647-5a7a-4fbf-954e-67418476628e-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " 
pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.604165 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8933a5ae-42ff-44b3-bd28-38a424729b83-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.604238 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.604391 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.604880 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.605294 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8933a5ae-42ff-44b3-bd28-38a424729b83-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.605326 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.605353 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/55871647-5a7a-4fbf-954e-67418476628e-ceph\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.605382 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8933a5ae-42ff-44b3-bd28-38a424729b83-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.605410 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.605823 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.605872 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-var-locks-brick\") pod 
\"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.605896 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.605906 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-etc-nvme\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.605955 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8933a5ae-42ff-44b3-bd28-38a424729b83-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.605986 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-sys\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606010 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606033 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606051 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-sys\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606057 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-etc-nvme\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.605971 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606099 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606145 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: 
\"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606182 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-run\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606207 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h66zr\" (UniqueName: \"kubernetes.io/projected/55871647-5a7a-4fbf-954e-67418476628e-kube-api-access-h66zr\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606235 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xcp8\" (UniqueName: \"kubernetes.io/projected/8933a5ae-42ff-44b3-bd28-38a424729b83-kube-api-access-2xcp8\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606274 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-lib-modules\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606297 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-dev\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606322 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8933a5ae-42ff-44b3-bd28-38a424729b83-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606344 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606364 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55871647-5a7a-4fbf-954e-67418476628e-scripts\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606394 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606430 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-dev\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606462 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55871647-5a7a-4fbf-954e-67418476628e-config-data-custom\") pod \"cinder-backup-0\" (UID: 
\"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606496 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606556 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-run\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606645 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-run\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606736 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606770 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606801 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-sys\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606875 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606908 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.606949 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-run\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.607118 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.607159 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-dev\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc 
kubenswrapper[4776]: I1204 10:33:53.607247 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.607283 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-sys\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.607308 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-dev\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.607334 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/55871647-5a7a-4fbf-954e-67418476628e-lib-modules\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.609887 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8933a5ae-42ff-44b3-bd28-38a424729b83-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.611396 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/55871647-5a7a-4fbf-954e-67418476628e-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.615005 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55871647-5a7a-4fbf-954e-67418476628e-config-data\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.619683 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55871647-5a7a-4fbf-954e-67418476628e-scripts\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.619790 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8933a5ae-42ff-44b3-bd28-38a424729b83-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.619882 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/55871647-5a7a-4fbf-954e-67418476628e-ceph\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.620035 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55871647-5a7a-4fbf-954e-67418476628e-config-data-custom\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.620038 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8933a5ae-42ff-44b3-bd28-38a424729b83-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.620322 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8933a5ae-42ff-44b3-bd28-38a424729b83-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.622462 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8933a5ae-42ff-44b3-bd28-38a424729b83-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.625133 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8933a5ae-42ff-44b3-bd28-38a424729b83-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.626382 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h66zr\" (UniqueName: \"kubernetes.io/projected/55871647-5a7a-4fbf-954e-67418476628e-kube-api-access-h66zr\") pod \"cinder-backup-0\" (UID: \"55871647-5a7a-4fbf-954e-67418476628e\") " pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.631595 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xcp8\" (UniqueName: 
\"kubernetes.io/projected/8933a5ae-42ff-44b3-bd28-38a424729b83-kube-api-access-2xcp8\") pod \"cinder-volume-volume1-0\" (UID: \"8933a5ae-42ff-44b3-bd28-38a424729b83\") " pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.673746 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.681798 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.936072 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-xz78f"] Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.938050 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-xz78f" Dec 04 10:33:53 crc kubenswrapper[4776]: I1204 10:33:53.974087 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-xz78f"] Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.016044 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvfqs\" (UniqueName: \"kubernetes.io/projected/3de77ab0-bcef-4f39-b1f5-10ea8feddbed-kube-api-access-fvfqs\") pod \"manila-db-create-xz78f\" (UID: \"3de77ab0-bcef-4f39-b1f5-10ea8feddbed\") " pod="openstack/manila-db-create-xz78f" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.016090 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3de77ab0-bcef-4f39-b1f5-10ea8feddbed-operator-scripts\") pod \"manila-db-create-xz78f\" (UID: \"3de77ab0-bcef-4f39-b1f5-10ea8feddbed\") " pod="openstack/manila-db-create-xz78f" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.044832 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/manila-a778-account-create-update-4tzth"] Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.046335 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-a778-account-create-update-4tzth" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.050116 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.071114 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-a778-account-create-update-4tzth"] Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.118407 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvfqs\" (UniqueName: \"kubernetes.io/projected/3de77ab0-bcef-4f39-b1f5-10ea8feddbed-kube-api-access-fvfqs\") pod \"manila-db-create-xz78f\" (UID: \"3de77ab0-bcef-4f39-b1f5-10ea8feddbed\") " pod="openstack/manila-db-create-xz78f" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.118461 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3de77ab0-bcef-4f39-b1f5-10ea8feddbed-operator-scripts\") pod \"manila-db-create-xz78f\" (UID: \"3de77ab0-bcef-4f39-b1f5-10ea8feddbed\") " pod="openstack/manila-db-create-xz78f" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.119507 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3de77ab0-bcef-4f39-b1f5-10ea8feddbed-operator-scripts\") pod \"manila-db-create-xz78f\" (UID: \"3de77ab0-bcef-4f39-b1f5-10ea8feddbed\") " pod="openstack/manila-db-create-xz78f" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.123050 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.127254 4776 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.146950 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.166901 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.167787 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.173221 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2jbsq" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.194839 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvfqs\" (UniqueName: \"kubernetes.io/projected/3de77ab0-bcef-4f39-b1f5-10ea8feddbed-kube-api-access-fvfqs\") pod \"manila-db-create-xz78f\" (UID: \"3de77ab0-bcef-4f39-b1f5-10ea8feddbed\") " pod="openstack/manila-db-create-xz78f" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.221220 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/702c5cbb-ee27-4853-ad86-5a00769a2963-ceph\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.221297 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-config-data\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 
10:33:54.221330 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.221396 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62604a5a-c38e-4972-92ee-a103a6214b3d-operator-scripts\") pod \"manila-a778-account-create-update-4tzth\" (UID: \"62604a5a-c38e-4972-92ee-a103a6214b3d\") " pod="openstack/manila-a778-account-create-update-4tzth" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.221566 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/702c5cbb-ee27-4853-ad86-5a00769a2963-logs\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.221599 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.221634 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.221666 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.221973 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx6fr\" (UniqueName: \"kubernetes.io/projected/702c5cbb-ee27-4853-ad86-5a00769a2963-kube-api-access-dx6fr\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.230102 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/702c5cbb-ee27-4853-ad86-5a00769a2963-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.230212 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-scripts\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.230279 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kx96\" (UniqueName: \"kubernetes.io/projected/62604a5a-c38e-4972-92ee-a103a6214b3d-kube-api-access-2kx96\") pod \"manila-a778-account-create-update-4tzth\" (UID: \"62604a5a-c38e-4972-92ee-a103a6214b3d\") " pod="openstack/manila-a778-account-create-update-4tzth" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.268691 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-8474fbc5b9-ntlst"] Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.270697 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.280222 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-4xs5g" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.280475 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.280756 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.281179 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.281622 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-xz78f" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.298086 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8474fbc5b9-ntlst"] Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.325999 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.328286 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.330558 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.331688 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/702c5cbb-ee27-4853-ad86-5a00769a2963-ceph\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.331724 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-config-data\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.331769 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-horizon-secret-key\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.331810 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62604a5a-c38e-4972-92ee-a103a6214b3d-operator-scripts\") pod \"manila-a778-account-create-update-4tzth\" (UID: \"62604a5a-c38e-4972-92ee-a103a6214b3d\") " pod="openstack/manila-a778-account-create-update-4tzth" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.331829 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-logs\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.331856 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-config-data\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.331894 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-scripts\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.332093 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhmbl\" (UniqueName: \"kubernetes.io/projected/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-kube-api-access-bhmbl\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.332161 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/702c5cbb-ee27-4853-ad86-5a00769a2963-logs\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.332187 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.332206 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.332222 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.332249 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx6fr\" (UniqueName: \"kubernetes.io/projected/702c5cbb-ee27-4853-ad86-5a00769a2963-kube-api-access-dx6fr\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.332271 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/702c5cbb-ee27-4853-ad86-5a00769a2963-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.332295 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-scripts\") 
pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.332322 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kx96\" (UniqueName: \"kubernetes.io/projected/62604a5a-c38e-4972-92ee-a103a6214b3d-kube-api-access-2kx96\") pod \"manila-a778-account-create-update-4tzth\" (UID: \"62604a5a-c38e-4972-92ee-a103a6214b3d\") " pod="openstack/manila-a778-account-create-update-4tzth" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.337016 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.340099 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62604a5a-c38e-4972-92ee-a103a6214b3d-operator-scripts\") pod \"manila-a778-account-create-update-4tzth\" (UID: \"62604a5a-c38e-4972-92ee-a103a6214b3d\") " pod="openstack/manila-a778-account-create-update-4tzth" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.340420 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.343444 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/702c5cbb-ee27-4853-ad86-5a00769a2963-ceph\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.343554 4776 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/702c5cbb-ee27-4853-ad86-5a00769a2963-logs\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.343973 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.344275 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/702c5cbb-ee27-4853-ad86-5a00769a2963-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.348704 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-config-data\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.348785 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7847c6bcd5-qq274"] Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.351951 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.358318 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7847c6bcd5-qq274"] Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.366210 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.366342 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.368803 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-scripts\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.374595 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:33:54 crc kubenswrapper[4776]: E1204 10:33:54.378865 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance kube-api-access-dx6fr], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="702c5cbb-ee27-4853-ad86-5a00769a2963" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.384218 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kx96\" (UniqueName: \"kubernetes.io/projected/62604a5a-c38e-4972-92ee-a103a6214b3d-kube-api-access-2kx96\") pod \"manila-a778-account-create-update-4tzth\" (UID: 
\"62604a5a-c38e-4972-92ee-a103a6214b3d\") " pod="openstack/manila-a778-account-create-update-4tzth" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.400499 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx6fr\" (UniqueName: \"kubernetes.io/projected/702c5cbb-ee27-4853-ad86-5a00769a2963-kube-api-access-dx6fr\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.434501 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-logs\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.434974 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/50125512-ecde-4687-8bd7-1365e307e3f7-ceph\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435000 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-config-data\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435032 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2lr7\" (UniqueName: \"kubernetes.io/projected/50125512-ecde-4687-8bd7-1365e307e3f7-kube-api-access-l2lr7\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") 
" pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435053 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50125512-ecde-4687-8bd7-1365e307e3f7-logs\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435093 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhmbl\" (UniqueName: \"kubernetes.io/projected/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-kube-api-access-bhmbl\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435116 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-scripts\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435161 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0576f02c-3bb8-4a18-a9b8-464e2bf22947-logs\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435198 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc 
kubenswrapper[4776]: I1204 10:33:54.435224 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435257 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50125512-ecde-4687-8bd7-1365e307e3f7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435280 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0576f02c-3bb8-4a18-a9b8-464e2bf22947-config-data\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435312 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0576f02c-3bb8-4a18-a9b8-464e2bf22947-horizon-secret-key\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435341 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znv65\" (UniqueName: \"kubernetes.io/projected/0576f02c-3bb8-4a18-a9b8-464e2bf22947-kube-api-access-znv65\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc 
kubenswrapper[4776]: I1204 10:33:54.435360 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435405 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435425 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435447 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-horizon-secret-key\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.435468 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0576f02c-3bb8-4a18-a9b8-464e2bf22947-scripts\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 
crc kubenswrapper[4776]: I1204 10:33:54.445796 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-logs\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.446510 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-scripts\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.447499 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-config-data\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.448226 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.451598 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-horizon-secret-key\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.465652 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 10:33:54 crc kubenswrapper[4776]: 
I1204 10:33:54.476428 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.483969 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhmbl\" (UniqueName: \"kubernetes.io/projected/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-kube-api-access-bhmbl\") pod \"horizon-8474fbc5b9-ntlst\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547164 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547232 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547272 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50125512-ecde-4687-8bd7-1365e307e3f7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547303 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0576f02c-3bb8-4a18-a9b8-464e2bf22947-config-data\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " 
pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547354 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0576f02c-3bb8-4a18-a9b8-464e2bf22947-horizon-secret-key\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547402 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znv65\" (UniqueName: \"kubernetes.io/projected/0576f02c-3bb8-4a18-a9b8-464e2bf22947-kube-api-access-znv65\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547427 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547479 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547504 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 
04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547536 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0576f02c-3bb8-4a18-a9b8-464e2bf22947-scripts\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547586 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/50125512-ecde-4687-8bd7-1365e307e3f7-ceph\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547627 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2lr7\" (UniqueName: \"kubernetes.io/projected/50125512-ecde-4687-8bd7-1365e307e3f7-kube-api-access-l2lr7\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547663 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50125512-ecde-4687-8bd7-1365e307e3f7-logs\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.547711 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0576f02c-3bb8-4a18-a9b8-464e2bf22947-logs\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.548217 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0576f02c-3bb8-4a18-a9b8-464e2bf22947-logs\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.548460 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.549199 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0576f02c-3bb8-4a18-a9b8-464e2bf22947-scripts\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.550881 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50125512-ecde-4687-8bd7-1365e307e3f7-logs\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.552103 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50125512-ecde-4687-8bd7-1365e307e3f7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.555116 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0576f02c-3bb8-4a18-a9b8-464e2bf22947-config-data\") pod 
\"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.555728 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0576f02c-3bb8-4a18-a9b8-464e2bf22947-horizon-secret-key\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.571562 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.571986 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.575117 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znv65\" (UniqueName: \"kubernetes.io/projected/0576f02c-3bb8-4a18-a9b8-464e2bf22947-kube-api-access-znv65\") pod \"horizon-7847c6bcd5-qq274\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.577718 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " 
pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.586325 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.588271 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2lr7\" (UniqueName: \"kubernetes.io/projected/50125512-ecde-4687-8bd7-1365e307e3f7-kube-api-access-l2lr7\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.597140 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/50125512-ecde-4687-8bd7-1365e307e3f7-ceph\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.602990 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.624578 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.638046 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.671027 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-a778-account-create-update-4tzth" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.693311 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.705224 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.707537 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-xz78f"] Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.762010 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"55871647-5a7a-4fbf-954e-67418476628e","Type":"ContainerStarted","Data":"50154660f61abcad6d91c435d4b7a4dd2774de818d16efc62cb067b7b07b4021"} Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.772647 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.772888 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"8933a5ae-42ff-44b3-bd28-38a424729b83","Type":"ContainerStarted","Data":"c363543ce8cd66d3d194414f90ecade6cb785ef3ffc2891ec0ce033b06c97df1"} Dec 04 10:33:54 crc kubenswrapper[4776]: I1204 10:33:54.945794 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.063563 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"702c5cbb-ee27-4853-ad86-5a00769a2963\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.063673 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx6fr\" (UniqueName: \"kubernetes.io/projected/702c5cbb-ee27-4853-ad86-5a00769a2963-kube-api-access-dx6fr\") pod \"702c5cbb-ee27-4853-ad86-5a00769a2963\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.063710 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-scripts\") pod \"702c5cbb-ee27-4853-ad86-5a00769a2963\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.063751 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/702c5cbb-ee27-4853-ad86-5a00769a2963-httpd-run\") pod \"702c5cbb-ee27-4853-ad86-5a00769a2963\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.063791 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-config-data\") pod \"702c5cbb-ee27-4853-ad86-5a00769a2963\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.063953 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/702c5cbb-ee27-4853-ad86-5a00769a2963-logs\") pod \"702c5cbb-ee27-4853-ad86-5a00769a2963\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.063993 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-combined-ca-bundle\") pod \"702c5cbb-ee27-4853-ad86-5a00769a2963\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.064067 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/702c5cbb-ee27-4853-ad86-5a00769a2963-ceph\") pod \"702c5cbb-ee27-4853-ad86-5a00769a2963\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.064104 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-public-tls-certs\") pod \"702c5cbb-ee27-4853-ad86-5a00769a2963\" (UID: \"702c5cbb-ee27-4853-ad86-5a00769a2963\") " Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.065780 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/702c5cbb-ee27-4853-ad86-5a00769a2963-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "702c5cbb-ee27-4853-ad86-5a00769a2963" (UID: "702c5cbb-ee27-4853-ad86-5a00769a2963"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.076545 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-config-data" (OuterVolumeSpecName: "config-data") pod "702c5cbb-ee27-4853-ad86-5a00769a2963" (UID: "702c5cbb-ee27-4853-ad86-5a00769a2963"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.076825 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702c5cbb-ee27-4853-ad86-5a00769a2963-ceph" (OuterVolumeSpecName: "ceph") pod "702c5cbb-ee27-4853-ad86-5a00769a2963" (UID: "702c5cbb-ee27-4853-ad86-5a00769a2963"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.077497 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/702c5cbb-ee27-4853-ad86-5a00769a2963-logs" (OuterVolumeSpecName: "logs") pod "702c5cbb-ee27-4853-ad86-5a00769a2963" (UID: "702c5cbb-ee27-4853-ad86-5a00769a2963"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.079008 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-scripts" (OuterVolumeSpecName: "scripts") pod "702c5cbb-ee27-4853-ad86-5a00769a2963" (UID: "702c5cbb-ee27-4853-ad86-5a00769a2963"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.080298 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "702c5cbb-ee27-4853-ad86-5a00769a2963" (UID: "702c5cbb-ee27-4853-ad86-5a00769a2963"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.081798 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "702c5cbb-ee27-4853-ad86-5a00769a2963" (UID: "702c5cbb-ee27-4853-ad86-5a00769a2963"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.082009 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702c5cbb-ee27-4853-ad86-5a00769a2963-kube-api-access-dx6fr" (OuterVolumeSpecName: "kube-api-access-dx6fr") pod "702c5cbb-ee27-4853-ad86-5a00769a2963" (UID: "702c5cbb-ee27-4853-ad86-5a00769a2963"). InnerVolumeSpecName "kube-api-access-dx6fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.085350 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "702c5cbb-ee27-4853-ad86-5a00769a2963" (UID: "702c5cbb-ee27-4853-ad86-5a00769a2963"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.167649 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.167714 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/702c5cbb-ee27-4853-ad86-5a00769a2963-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.167734 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.167754 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/702c5cbb-ee27-4853-ad86-5a00769a2963-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.167771 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.167786 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/702c5cbb-ee27-4853-ad86-5a00769a2963-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.167797 4776 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/702c5cbb-ee27-4853-ad86-5a00769a2963-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.167844 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started 
for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.167857 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx6fr\" (UniqueName: \"kubernetes.io/projected/702c5cbb-ee27-4853-ad86-5a00769a2963-kube-api-access-dx6fr\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.219216 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.255069 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8474fbc5b9-ntlst"] Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.270399 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.363980 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-a778-account-create-update-4tzth"] Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.396822 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:33:55 crc kubenswrapper[4776]: W1204 10:33:55.410087 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50125512_ecde_4687_8bd7_1365e307e3f7.slice/crio-8c3ab238fdd815669b163d1d10f4c93183da6be09b10f3b1c33b61c99b432f49 WatchSource:0}: Error finding container 8c3ab238fdd815669b163d1d10f4c93183da6be09b10f3b1c33b61c99b432f49: Status 404 returned error can't find the container with id 8c3ab238fdd815669b163d1d10f4c93183da6be09b10f3b1c33b61c99b432f49 Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.429614 4776 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/horizon-7847c6bcd5-qq274"] Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.800544 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7847c6bcd5-qq274" event={"ID":"0576f02c-3bb8-4a18-a9b8-464e2bf22947","Type":"ContainerStarted","Data":"c0443deb32095c08686092562d576abd5f8612325b8ced4330621eb0517e70dd"} Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.805489 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"50125512-ecde-4687-8bd7-1365e307e3f7","Type":"ContainerStarted","Data":"8c3ab238fdd815669b163d1d10f4c93183da6be09b10f3b1c33b61c99b432f49"} Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.809423 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-a778-account-create-update-4tzth" event={"ID":"62604a5a-c38e-4972-92ee-a103a6214b3d","Type":"ContainerStarted","Data":"45644a33ade47c2dd181d990b186fdf183bfdbd17ef480b3b740de7f2dd98e5c"} Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.821217 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8474fbc5b9-ntlst" event={"ID":"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8","Type":"ContainerStarted","Data":"231c192b63c81509c18864e26f95fdac0b76b801b68d21a738eab54185d4ff70"} Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.826514 4776 generic.go:334] "Generic (PLEG): container finished" podID="3de77ab0-bcef-4f39-b1f5-10ea8feddbed" containerID="82612873a29b4d392dc8016f0766b610029f1dc5bee7cac7631e657f1b1f4bc3" exitCode=0 Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.826633 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.827778 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-xz78f" event={"ID":"3de77ab0-bcef-4f39-b1f5-10ea8feddbed","Type":"ContainerDied","Data":"82612873a29b4d392dc8016f0766b610029f1dc5bee7cac7631e657f1b1f4bc3"} Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.827815 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-xz78f" event={"ID":"3de77ab0-bcef-4f39-b1f5-10ea8feddbed","Type":"ContainerStarted","Data":"cf0ba160213a9eb1b24161bf705a5ad1ce2ab4fb134128e570c004faea838221"} Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.894844 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.915488 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.925087 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.927145 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.929155 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.929320 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 10:33:55 crc kubenswrapper[4776]: I1204 10:33:55.943019 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.002029 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-config-data\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.002888 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-ceph\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.010991 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.011454 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-scripts\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.011579 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.011997 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-logs\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.012301 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.013567 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.014779 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrkwg\" (UniqueName: 
\"kubernetes.io/projected/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-kube-api-access-jrkwg\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.116557 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrkwg\" (UniqueName: \"kubernetes.io/projected/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-kube-api-access-jrkwg\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.116766 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-config-data\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.116937 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-ceph\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.117027 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.117232 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-scripts\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.117305 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.117445 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-logs\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.117647 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.117789 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.118246 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: 
\"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.118549 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.121173 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-logs\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.122804 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.122891 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-ceph\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.126167 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-config-data\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 
10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.126737 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.140420 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrkwg\" (UniqueName: \"kubernetes.io/projected/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-kube-api-access-jrkwg\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.141849 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-scripts\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.177734 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.282219 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.837319 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"50125512-ecde-4687-8bd7-1365e307e3f7","Type":"ContainerStarted","Data":"e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941"} Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.838879 4776 generic.go:334] "Generic (PLEG): container finished" podID="62604a5a-c38e-4972-92ee-a103a6214b3d" containerID="9851c74bf0b4ea7fd66e142ac5d05c7130d8784d6732a058ee1926e5787ad49d" exitCode=0 Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.838934 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-a778-account-create-update-4tzth" event={"ID":"62604a5a-c38e-4972-92ee-a103a6214b3d","Type":"ContainerDied","Data":"9851c74bf0b4ea7fd66e142ac5d05c7130d8784d6732a058ee1926e5787ad49d"} Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.840720 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"55871647-5a7a-4fbf-954e-67418476628e","Type":"ContainerStarted","Data":"0e40f93dc0443154497f1e7cc473cb74ca60b6356a6c2b8218d02ad6358d72ea"} Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.840748 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"55871647-5a7a-4fbf-954e-67418476628e","Type":"ContainerStarted","Data":"f891b5f44b693c2523b4b449eb2c372f47419548898729cfc7b1f957ea66d1ec"} Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.888743 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"8933a5ae-42ff-44b3-bd28-38a424729b83","Type":"ContainerStarted","Data":"f48b1603a55cb1d8a04537e68907567ab6bb5926f249617a0caf6b8c57c0fe36"} Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.889145 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-volume-volume1-0" event={"ID":"8933a5ae-42ff-44b3-bd28-38a424729b83","Type":"ContainerStarted","Data":"87daee64be214f1b7440c97756922ff37b114966001d6a08890d49e2e8ed9013"} Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.950739 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7847c6bcd5-qq274"] Dec 04 10:33:56 crc kubenswrapper[4776]: I1204 10:33:56.988741 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.9227338830000003 podStartE2EDuration="3.988721395s" podCreationTimestamp="2025-12-04 10:33:53 +0000 UTC" firstStartedPulling="2025-12-04 10:33:54.644035316 +0000 UTC m=+3279.510515703" lastFinishedPulling="2025-12-04 10:33:55.710022848 +0000 UTC m=+3280.576503215" observedRunningTime="2025-12-04 10:33:56.908380406 +0000 UTC m=+3281.774860793" watchObservedRunningTime="2025-12-04 10:33:56.988721395 +0000 UTC m=+3281.855201772" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.024071 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.090056678 podStartE2EDuration="4.024047536s" podCreationTimestamp="2025-12-04 10:33:53 +0000 UTC" firstStartedPulling="2025-12-04 10:33:54.476203065 +0000 UTC m=+3279.342683432" lastFinishedPulling="2025-12-04 10:33:55.410193913 +0000 UTC m=+3280.276674290" observedRunningTime="2025-12-04 10:33:56.947756695 +0000 UTC m=+3281.814237092" watchObservedRunningTime="2025-12-04 10:33:57.024047536 +0000 UTC m=+3281.890527913" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.058139 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-85fb87d5bd-kzccs"] Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.060716 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.067791 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.099604 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85fb87d5bd-kzccs"] Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.170654 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.193243 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8474fbc5b9-ntlst"] Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.197282 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.251811 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d787f787d-lqf8p"] Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.257483 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.313017 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ec22398-eab3-46af-8843-1c71a2f5db12-scripts\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.313128 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjr5v\" (UniqueName: \"kubernetes.io/projected/1b6d19b7-632e-4f82-8311-13e154f240f5-kube-api-access-zjr5v\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.314069 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-horizon-secret-key\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.314250 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b6d19b7-632e-4f82-8311-13e154f240f5-config-data\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.314544 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ec22398-eab3-46af-8843-1c71a2f5db12-combined-ca-bundle\") pod \"horizon-6d787f787d-lqf8p\" (UID: 
\"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.314891 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ec22398-eab3-46af-8843-1c71a2f5db12-logs\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.315123 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ec22398-eab3-46af-8843-1c71a2f5db12-horizon-tls-certs\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.315297 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b6d19b7-632e-4f82-8311-13e154f240f5-scripts\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.320511 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpdp7\" (UniqueName: \"kubernetes.io/projected/1ec22398-eab3-46af-8843-1c71a2f5db12-kube-api-access-jpdp7\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.320641 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-combined-ca-bundle\") pod \"horizon-85fb87d5bd-kzccs\" (UID: 
\"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.320707 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-horizon-tls-certs\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.320760 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ec22398-eab3-46af-8843-1c71a2f5db12-horizon-secret-key\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.320817 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b6d19b7-632e-4f82-8311-13e154f240f5-logs\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.321211 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ec22398-eab3-46af-8843-1c71a2f5db12-config-data\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.327694 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d787f787d-lqf8p"] Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.340156 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 
10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.423523 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ec22398-eab3-46af-8843-1c71a2f5db12-logs\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424008 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ec22398-eab3-46af-8843-1c71a2f5db12-horizon-tls-certs\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424017 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ec22398-eab3-46af-8843-1c71a2f5db12-logs\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424059 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b6d19b7-632e-4f82-8311-13e154f240f5-scripts\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424092 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpdp7\" (UniqueName: \"kubernetes.io/projected/1ec22398-eab3-46af-8843-1c71a2f5db12-kube-api-access-jpdp7\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424130 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-combined-ca-bundle\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424147 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-horizon-tls-certs\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424172 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ec22398-eab3-46af-8843-1c71a2f5db12-horizon-secret-key\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424205 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ec22398-eab3-46af-8843-1c71a2f5db12-config-data\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424226 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b6d19b7-632e-4f82-8311-13e154f240f5-logs\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424278 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ec22398-eab3-46af-8843-1c71a2f5db12-scripts\") pod 
\"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424315 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjr5v\" (UniqueName: \"kubernetes.io/projected/1b6d19b7-632e-4f82-8311-13e154f240f5-kube-api-access-zjr5v\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424372 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-horizon-secret-key\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424399 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b6d19b7-632e-4f82-8311-13e154f240f5-config-data\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424447 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ec22398-eab3-46af-8843-1c71a2f5db12-combined-ca-bundle\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.424944 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b6d19b7-632e-4f82-8311-13e154f240f5-logs\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " 
pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.425961 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ec22398-eab3-46af-8843-1c71a2f5db12-config-data\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.426957 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ec22398-eab3-46af-8843-1c71a2f5db12-scripts\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.427548 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b6d19b7-632e-4f82-8311-13e154f240f5-scripts\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.429355 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b6d19b7-632e-4f82-8311-13e154f240f5-config-data\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.430865 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ec22398-eab3-46af-8843-1c71a2f5db12-horizon-tls-certs\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.432105 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-combined-ca-bundle\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.435345 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-horizon-secret-key\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.438228 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-horizon-tls-certs\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.438655 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ec22398-eab3-46af-8843-1c71a2f5db12-horizon-secret-key\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.446747 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjr5v\" (UniqueName: \"kubernetes.io/projected/1b6d19b7-632e-4f82-8311-13e154f240f5-kube-api-access-zjr5v\") pod \"horizon-85fb87d5bd-kzccs\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") " pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.448704 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1ec22398-eab3-46af-8843-1c71a2f5db12-combined-ca-bundle\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.453462 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpdp7\" (UniqueName: \"kubernetes.io/projected/1ec22398-eab3-46af-8843-1c71a2f5db12-kube-api-access-jpdp7\") pod \"horizon-6d787f787d-lqf8p\" (UID: \"1ec22398-eab3-46af-8843-1c71a2f5db12\") " pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.465788 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-xz78f" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.467887 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="702c5cbb-ee27-4853-ad86-5a00769a2963" path="/var/lib/kubelet/pods/702c5cbb-ee27-4853-ad86-5a00769a2963/volumes" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.532827 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3de77ab0-bcef-4f39-b1f5-10ea8feddbed-operator-scripts\") pod \"3de77ab0-bcef-4f39-b1f5-10ea8feddbed\" (UID: \"3de77ab0-bcef-4f39-b1f5-10ea8feddbed\") " Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.533026 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvfqs\" (UniqueName: \"kubernetes.io/projected/3de77ab0-bcef-4f39-b1f5-10ea8feddbed-kube-api-access-fvfqs\") pod \"3de77ab0-bcef-4f39-b1f5-10ea8feddbed\" (UID: \"3de77ab0-bcef-4f39-b1f5-10ea8feddbed\") " Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.533507 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3de77ab0-bcef-4f39-b1f5-10ea8feddbed-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "3de77ab0-bcef-4f39-b1f5-10ea8feddbed" (UID: "3de77ab0-bcef-4f39-b1f5-10ea8feddbed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.534896 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3de77ab0-bcef-4f39-b1f5-10ea8feddbed-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.539006 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de77ab0-bcef-4f39-b1f5-10ea8feddbed-kube-api-access-fvfqs" (OuterVolumeSpecName: "kube-api-access-fvfqs") pod "3de77ab0-bcef-4f39-b1f5-10ea8feddbed" (UID: "3de77ab0-bcef-4f39-b1f5-10ea8feddbed"). InnerVolumeSpecName "kube-api-access-fvfqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.584820 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.638874 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvfqs\" (UniqueName: \"kubernetes.io/projected/3de77ab0-bcef-4f39-b1f5-10ea8feddbed-kube-api-access-fvfqs\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.704598 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.921657 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-xz78f" event={"ID":"3de77ab0-bcef-4f39-b1f5-10ea8feddbed","Type":"ContainerDied","Data":"cf0ba160213a9eb1b24161bf705a5ad1ce2ab4fb134128e570c004faea838221"} Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.922108 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf0ba160213a9eb1b24161bf705a5ad1ce2ab4fb134128e570c004faea838221" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.922205 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-xz78f" Dec 04 10:33:57 crc kubenswrapper[4776]: I1204 10:33:57.982662 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07092d70-d0e2-4247-a1e6-9744f5f3b7f9","Type":"ContainerStarted","Data":"0b0549b123f4de5f739edfd0a2571e9a04e259aa043c28c24b010ac391f6b79f"} Dec 04 10:33:58 crc kubenswrapper[4776]: I1204 10:33:58.452549 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d787f787d-lqf8p"] Dec 04 10:33:58 crc kubenswrapper[4776]: I1204 10:33:58.589987 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85fb87d5bd-kzccs"] Dec 04 10:33:58 crc kubenswrapper[4776]: I1204 10:33:58.674340 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 04 10:33:58 crc kubenswrapper[4776]: I1204 10:33:58.682484 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 04 10:33:58 crc kubenswrapper[4776]: I1204 10:33:58.908434 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-a778-account-create-update-4tzth" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.005770 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kx96\" (UniqueName: \"kubernetes.io/projected/62604a5a-c38e-4972-92ee-a103a6214b3d-kube-api-access-2kx96\") pod \"62604a5a-c38e-4972-92ee-a103a6214b3d\" (UID: \"62604a5a-c38e-4972-92ee-a103a6214b3d\") " Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.006464 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62604a5a-c38e-4972-92ee-a103a6214b3d-operator-scripts\") pod \"62604a5a-c38e-4972-92ee-a103a6214b3d\" (UID: \"62604a5a-c38e-4972-92ee-a103a6214b3d\") " Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.009489 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62604a5a-c38e-4972-92ee-a103a6214b3d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62604a5a-c38e-4972-92ee-a103a6214b3d" (UID: "62604a5a-c38e-4972-92ee-a103a6214b3d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.011205 4776 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62604a5a-c38e-4972-92ee-a103a6214b3d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.019406 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62604a5a-c38e-4972-92ee-a103a6214b3d-kube-api-access-2kx96" (OuterVolumeSpecName: "kube-api-access-2kx96") pod "62604a5a-c38e-4972-92ee-a103a6214b3d" (UID: "62604a5a-c38e-4972-92ee-a103a6214b3d"). InnerVolumeSpecName "kube-api-access-2kx96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.028573 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"50125512-ecde-4687-8bd7-1365e307e3f7","Type":"ContainerStarted","Data":"2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680"} Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.028843 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="50125512-ecde-4687-8bd7-1365e307e3f7" containerName="glance-log" containerID="cri-o://e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941" gracePeriod=30 Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.029946 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="50125512-ecde-4687-8bd7-1365e307e3f7" containerName="glance-httpd" containerID="cri-o://2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680" gracePeriod=30 Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.035646 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-a778-account-create-update-4tzth" event={"ID":"62604a5a-c38e-4972-92ee-a103a6214b3d","Type":"ContainerDied","Data":"45644a33ade47c2dd181d990b186fdf183bfdbd17ef480b3b740de7f2dd98e5c"} Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.035728 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45644a33ade47c2dd181d990b186fdf183bfdbd17ef480b3b740de7f2dd98e5c" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.036846 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-a778-account-create-update-4tzth" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.039589 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fb87d5bd-kzccs" event={"ID":"1b6d19b7-632e-4f82-8311-13e154f240f5","Type":"ContainerStarted","Data":"54ae4b003648727847f56a649a69e47d0ac7c6654be0557f0853ffdd0f8d877c"} Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.042089 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d787f787d-lqf8p" event={"ID":"1ec22398-eab3-46af-8843-1c71a2f5db12","Type":"ContainerStarted","Data":"2f638c669a35b52ae42abd747d770ddac9686e88fe7a9a8e393b48b9337ce04c"} Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.050893 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07092d70-d0e2-4247-a1e6-9744f5f3b7f9","Type":"ContainerStarted","Data":"19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af"} Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.071135 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.071108869 podStartE2EDuration="5.071108869s" podCreationTimestamp="2025-12-04 10:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:33:59.057543273 +0000 UTC m=+3283.924023650" watchObservedRunningTime="2025-12-04 10:33:59.071108869 +0000 UTC m=+3283.937589246" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.113777 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kx96\" (UniqueName: \"kubernetes.io/projected/62604a5a-c38e-4972-92ee-a103a6214b3d-kube-api-access-2kx96\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.777810 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.832817 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2lr7\" (UniqueName: \"kubernetes.io/projected/50125512-ecde-4687-8bd7-1365e307e3f7-kube-api-access-l2lr7\") pod \"50125512-ecde-4687-8bd7-1365e307e3f7\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.832981 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-internal-tls-certs\") pod \"50125512-ecde-4687-8bd7-1365e307e3f7\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.833034 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-config-data\") pod \"50125512-ecde-4687-8bd7-1365e307e3f7\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.833143 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-combined-ca-bundle\") pod \"50125512-ecde-4687-8bd7-1365e307e3f7\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.833177 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-scripts\") pod \"50125512-ecde-4687-8bd7-1365e307e3f7\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.833205 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/projected/50125512-ecde-4687-8bd7-1365e307e3f7-ceph\") pod \"50125512-ecde-4687-8bd7-1365e307e3f7\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.833221 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50125512-ecde-4687-8bd7-1365e307e3f7-logs\") pod \"50125512-ecde-4687-8bd7-1365e307e3f7\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.833240 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50125512-ecde-4687-8bd7-1365e307e3f7-httpd-run\") pod \"50125512-ecde-4687-8bd7-1365e307e3f7\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.833277 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"50125512-ecde-4687-8bd7-1365e307e3f7\" (UID: \"50125512-ecde-4687-8bd7-1365e307e3f7\") " Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.834402 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50125512-ecde-4687-8bd7-1365e307e3f7-logs" (OuterVolumeSpecName: "logs") pod "50125512-ecde-4687-8bd7-1365e307e3f7" (UID: "50125512-ecde-4687-8bd7-1365e307e3f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.835306 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50125512-ecde-4687-8bd7-1365e307e3f7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "50125512-ecde-4687-8bd7-1365e307e3f7" (UID: "50125512-ecde-4687-8bd7-1365e307e3f7"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.841009 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50125512-ecde-4687-8bd7-1365e307e3f7-kube-api-access-l2lr7" (OuterVolumeSpecName: "kube-api-access-l2lr7") pod "50125512-ecde-4687-8bd7-1365e307e3f7" (UID: "50125512-ecde-4687-8bd7-1365e307e3f7"). InnerVolumeSpecName "kube-api-access-l2lr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.841367 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-scripts" (OuterVolumeSpecName: "scripts") pod "50125512-ecde-4687-8bd7-1365e307e3f7" (UID: "50125512-ecde-4687-8bd7-1365e307e3f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.841659 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "50125512-ecde-4687-8bd7-1365e307e3f7" (UID: "50125512-ecde-4687-8bd7-1365e307e3f7"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.842176 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50125512-ecde-4687-8bd7-1365e307e3f7-ceph" (OuterVolumeSpecName: "ceph") pod "50125512-ecde-4687-8bd7-1365e307e3f7" (UID: "50125512-ecde-4687-8bd7-1365e307e3f7"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.878163 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50125512-ecde-4687-8bd7-1365e307e3f7" (UID: "50125512-ecde-4687-8bd7-1365e307e3f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.894591 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "50125512-ecde-4687-8bd7-1365e307e3f7" (UID: "50125512-ecde-4687-8bd7-1365e307e3f7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.905490 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-config-data" (OuterVolumeSpecName: "config-data") pod "50125512-ecde-4687-8bd7-1365e307e3f7" (UID: "50125512-ecde-4687-8bd7-1365e307e3f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.935967 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50125512-ecde-4687-8bd7-1365e307e3f7-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.936014 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/50125512-ecde-4687-8bd7-1365e307e3f7-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.936027 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50125512-ecde-4687-8bd7-1365e307e3f7-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.936073 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.936087 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2lr7\" (UniqueName: \"kubernetes.io/projected/50125512-ecde-4687-8bd7-1365e307e3f7-kube-api-access-l2lr7\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.936099 4776 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.936111 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.936123 4776 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.936133 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50125512-ecde-4687-8bd7-1365e307e3f7-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:33:59 crc kubenswrapper[4776]: I1204 10:33:59.964611 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.037910 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.073586 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07092d70-d0e2-4247-a1e6-9744f5f3b7f9","Type":"ContainerStarted","Data":"3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f"} Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.073673 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="07092d70-d0e2-4247-a1e6-9744f5f3b7f9" containerName="glance-log" containerID="cri-o://19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af" gracePeriod=30 Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.073725 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="07092d70-d0e2-4247-a1e6-9744f5f3b7f9" containerName="glance-httpd" containerID="cri-o://3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f" gracePeriod=30 Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.077363 4776 generic.go:334] 
"Generic (PLEG): container finished" podID="50125512-ecde-4687-8bd7-1365e307e3f7" containerID="2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680" exitCode=0 Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.079017 4776 generic.go:334] "Generic (PLEG): container finished" podID="50125512-ecde-4687-8bd7-1365e307e3f7" containerID="e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941" exitCode=143 Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.077543 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.077546 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"50125512-ecde-4687-8bd7-1365e307e3f7","Type":"ContainerDied","Data":"2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680"} Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.079103 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"50125512-ecde-4687-8bd7-1365e307e3f7","Type":"ContainerDied","Data":"e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941"} Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.079135 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"50125512-ecde-4687-8bd7-1365e307e3f7","Type":"ContainerDied","Data":"8c3ab238fdd815669b163d1d10f4c93183da6be09b10f3b1c33b61c99b432f49"} Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.079153 4776 scope.go:117] "RemoveContainer" containerID="2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.104147 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.104131275 podStartE2EDuration="5.104131275s" 
podCreationTimestamp="2025-12-04 10:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:00.097519676 +0000 UTC m=+3284.964000063" watchObservedRunningTime="2025-12-04 10:34:00.104131275 +0000 UTC m=+3284.970611652" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.151299 4776 scope.go:117] "RemoveContainer" containerID="e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.163792 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.188358 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.227854 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:34:00 crc kubenswrapper[4776]: E1204 10:34:00.232683 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50125512-ecde-4687-8bd7-1365e307e3f7" containerName="glance-log" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.232725 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="50125512-ecde-4687-8bd7-1365e307e3f7" containerName="glance-log" Dec 04 10:34:00 crc kubenswrapper[4776]: E1204 10:34:00.232740 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de77ab0-bcef-4f39-b1f5-10ea8feddbed" containerName="mariadb-database-create" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.232748 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de77ab0-bcef-4f39-b1f5-10ea8feddbed" containerName="mariadb-database-create" Dec 04 10:34:00 crc kubenswrapper[4776]: E1204 10:34:00.232768 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62604a5a-c38e-4972-92ee-a103a6214b3d" 
containerName="mariadb-account-create-update" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.232776 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="62604a5a-c38e-4972-92ee-a103a6214b3d" containerName="mariadb-account-create-update" Dec 04 10:34:00 crc kubenswrapper[4776]: E1204 10:34:00.232793 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50125512-ecde-4687-8bd7-1365e307e3f7" containerName="glance-httpd" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.232800 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="50125512-ecde-4687-8bd7-1365e307e3f7" containerName="glance-httpd" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.233042 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="50125512-ecde-4687-8bd7-1365e307e3f7" containerName="glance-httpd" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.233066 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="50125512-ecde-4687-8bd7-1365e307e3f7" containerName="glance-log" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.233082 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de77ab0-bcef-4f39-b1f5-10ea8feddbed" containerName="mariadb-database-create" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.233107 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="62604a5a-c38e-4972-92ee-a103a6214b3d" containerName="mariadb-account-create-update" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.243850 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.244067 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.246467 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.247625 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.347828 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhzdc\" (UniqueName: \"kubernetes.io/projected/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-kube-api-access-xhzdc\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.348232 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.348268 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.348333 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.348415 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.348480 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.348501 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.348526 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.348571 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.349481 4776 scope.go:117] "RemoveContainer" containerID="2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680" Dec 04 10:34:00 crc kubenswrapper[4776]: E1204 10:34:00.350526 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680\": container with ID starting with 2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680 not found: ID does not exist" containerID="2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.350555 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680"} err="failed to get container status \"2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680\": rpc error: code = NotFound desc = could not find container \"2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680\": container with ID starting with 2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680 not found: ID does not exist" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.350577 4776 scope.go:117] "RemoveContainer" containerID="e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941" Dec 04 10:34:00 crc kubenswrapper[4776]: E1204 10:34:00.352960 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941\": container with ID starting with e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941 not found: ID does not exist" containerID="e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941" Dec 04 10:34:00 crc 
kubenswrapper[4776]: I1204 10:34:00.353008 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941"} err="failed to get container status \"e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941\": rpc error: code = NotFound desc = could not find container \"e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941\": container with ID starting with e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941 not found: ID does not exist" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.353026 4776 scope.go:117] "RemoveContainer" containerID="2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.353758 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680"} err="failed to get container status \"2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680\": rpc error: code = NotFound desc = could not find container \"2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680\": container with ID starting with 2dd52944bb2a6a29bcd7d82c3125e8adb6f169302450019d110b7ee53ab01680 not found: ID does not exist" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.353804 4776 scope.go:117] "RemoveContainer" containerID="e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.355280 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941"} err="failed to get container status \"e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941\": rpc error: code = NotFound desc = could not find container \"e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941\": container 
with ID starting with e08920244226176d73646f2703fc5308ec1568963de46bc355705f35922f8941 not found: ID does not exist" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.450902 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhzdc\" (UniqueName: \"kubernetes.io/projected/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-kube-api-access-xhzdc\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.451029 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.451064 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.451126 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-logs\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.451405 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.451528 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.451601 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.451644 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-logs\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.451553 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.455563 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 
10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.455613 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.455950 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.458045 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.458209 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.458470 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.460644 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.464714 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.476652 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhzdc\" (UniqueName: \"kubernetes.io/projected/7cdcbd14-b300-4a1d-b3c5-0cf70e20b290-kube-api-access-xhzdc\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.507866 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290\") " pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.633824 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.824168 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.865610 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-public-tls-certs\") pod \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.865702 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-combined-ca-bundle\") pod \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.865787 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-httpd-run\") pod \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.865845 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-ceph\") pod \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.865883 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-logs\") pod \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.865972 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-scripts\") pod \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.866012 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-config-data\") pod \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.866062 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrkwg\" (UniqueName: \"kubernetes.io/projected/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-kube-api-access-jrkwg\") pod \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.866158 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\" (UID: \"07092d70-d0e2-4247-a1e6-9744f5f3b7f9\") " Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.890471 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-logs" (OuterVolumeSpecName: "logs") pod "07092d70-d0e2-4247-a1e6-9744f5f3b7f9" (UID: "07092d70-d0e2-4247-a1e6-9744f5f3b7f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.890786 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "07092d70-d0e2-4247-a1e6-9744f5f3b7f9" (UID: "07092d70-d0e2-4247-a1e6-9744f5f3b7f9"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.890954 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "07092d70-d0e2-4247-a1e6-9744f5f3b7f9" (UID: "07092d70-d0e2-4247-a1e6-9744f5f3b7f9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.896129 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-scripts" (OuterVolumeSpecName: "scripts") pod "07092d70-d0e2-4247-a1e6-9744f5f3b7f9" (UID: "07092d70-d0e2-4247-a1e6-9744f5f3b7f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.897518 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-ceph" (OuterVolumeSpecName: "ceph") pod "07092d70-d0e2-4247-a1e6-9744f5f3b7f9" (UID: "07092d70-d0e2-4247-a1e6-9744f5f3b7f9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.898951 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-kube-api-access-jrkwg" (OuterVolumeSpecName: "kube-api-access-jrkwg") pod "07092d70-d0e2-4247-a1e6-9744f5f3b7f9" (UID: "07092d70-d0e2-4247-a1e6-9744f5f3b7f9"). InnerVolumeSpecName "kube-api-access-jrkwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.928006 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07092d70-d0e2-4247-a1e6-9744f5f3b7f9" (UID: "07092d70-d0e2-4247-a1e6-9744f5f3b7f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.983081 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.983115 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.983130 4776 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.983142 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.983153 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.983163 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.983200 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrkwg\" (UniqueName: \"kubernetes.io/projected/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-kube-api-access-jrkwg\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.994530 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-config-data" (OuterVolumeSpecName: "config-data") pod "07092d70-d0e2-4247-a1e6-9744f5f3b7f9" (UID: "07092d70-d0e2-4247-a1e6-9744f5f3b7f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:00 crc kubenswrapper[4776]: I1204 10:34:00.994616 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "07092d70-d0e2-4247-a1e6-9744f5f3b7f9" (UID: "07092d70-d0e2-4247-a1e6-9744f5f3b7f9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.026004 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.085479 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.085508 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.085521 4776 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07092d70-d0e2-4247-a1e6-9744f5f3b7f9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.094974 4776 generic.go:334] "Generic (PLEG): container finished" podID="07092d70-d0e2-4247-a1e6-9744f5f3b7f9" containerID="3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f" exitCode=0 Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.095001 4776 generic.go:334] "Generic (PLEG): container finished" podID="07092d70-d0e2-4247-a1e6-9744f5f3b7f9" containerID="19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af" exitCode=143 Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.095048 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07092d70-d0e2-4247-a1e6-9744f5f3b7f9","Type":"ContainerDied","Data":"3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f"} Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.095076 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"07092d70-d0e2-4247-a1e6-9744f5f3b7f9","Type":"ContainerDied","Data":"19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af"} Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.095088 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07092d70-d0e2-4247-a1e6-9744f5f3b7f9","Type":"ContainerDied","Data":"0b0549b123f4de5f739edfd0a2571e9a04e259aa043c28c24b010ac391f6b79f"} Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.095106 4776 scope.go:117] "RemoveContainer" containerID="3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.095233 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.136532 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.163369 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.166264 4776 scope.go:117] "RemoveContainer" containerID="19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.176306 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:34:01 crc kubenswrapper[4776]: E1204 10:34:01.176794 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07092d70-d0e2-4247-a1e6-9744f5f3b7f9" containerName="glance-log" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.176810 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="07092d70-d0e2-4247-a1e6-9744f5f3b7f9" containerName="glance-log" Dec 04 10:34:01 crc kubenswrapper[4776]: E1204 10:34:01.176830 
4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07092d70-d0e2-4247-a1e6-9744f5f3b7f9" containerName="glance-httpd" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.176842 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="07092d70-d0e2-4247-a1e6-9744f5f3b7f9" containerName="glance-httpd" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.178456 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="07092d70-d0e2-4247-a1e6-9744f5f3b7f9" containerName="glance-httpd" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.178489 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="07092d70-d0e2-4247-a1e6-9744f5f3b7f9" containerName="glance-log" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.179599 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.186732 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.187057 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.206110 4776 scope.go:117] "RemoveContainer" containerID="3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f" Dec 04 10:34:01 crc kubenswrapper[4776]: E1204 10:34:01.206622 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f\": container with ID starting with 3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f not found: ID does not exist" containerID="3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.206677 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f"} err="failed to get container status \"3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f\": rpc error: code = NotFound desc = could not find container \"3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f\": container with ID starting with 3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f not found: ID does not exist" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.206708 4776 scope.go:117] "RemoveContainer" containerID="19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.207677 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:34:01 crc kubenswrapper[4776]: E1204 10:34:01.207953 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af\": container with ID starting with 19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af not found: ID does not exist" containerID="19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.207975 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af"} err="failed to get container status \"19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af\": rpc error: code = NotFound desc = could not find container \"19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af\": container with ID starting with 19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af not found: ID does not exist" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.207990 4776 
scope.go:117] "RemoveContainer" containerID="3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.208212 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f"} err="failed to get container status \"3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f\": rpc error: code = NotFound desc = could not find container \"3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f\": container with ID starting with 3c4f6fccc687fe882c3f9cf6ce17f79984e47257e870beae840898b969cc944f not found: ID does not exist" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.208230 4776 scope.go:117] "RemoveContainer" containerID="19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.208638 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af"} err="failed to get container status \"19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af\": rpc error: code = NotFound desc = could not find container \"19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af\": container with ID starting with 19916fa71317d0cb34f14522d5687981d291c0d4216ca1d475c4268f529398af not found: ID does not exist" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.301795 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d85923-9fcd-437f-b584-6e86641bccdf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.301846 4776 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77d85923-9fcd-437f-b584-6e86641bccdf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.301872 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-496dg\" (UniqueName: \"kubernetes.io/projected/77d85923-9fcd-437f-b584-6e86641bccdf-kube-api-access-496dg\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.301939 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77d85923-9fcd-437f-b584-6e86641bccdf-scripts\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.302037 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77d85923-9fcd-437f-b584-6e86641bccdf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.302076 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.302123 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77d85923-9fcd-437f-b584-6e86641bccdf-config-data\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.302159 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77d85923-9fcd-437f-b584-6e86641bccdf-logs\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.302252 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/77d85923-9fcd-437f-b584-6e86641bccdf-ceph\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.404505 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d85923-9fcd-437f-b584-6e86641bccdf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.404561 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77d85923-9fcd-437f-b584-6e86641bccdf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.404589 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-496dg\" (UniqueName: \"kubernetes.io/projected/77d85923-9fcd-437f-b584-6e86641bccdf-kube-api-access-496dg\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.404641 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77d85923-9fcd-437f-b584-6e86641bccdf-scripts\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.404690 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.404716 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77d85923-9fcd-437f-b584-6e86641bccdf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.404761 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77d85923-9fcd-437f-b584-6e86641bccdf-config-data\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.404794 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/77d85923-9fcd-437f-b584-6e86641bccdf-logs\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.404884 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/77d85923-9fcd-437f-b584-6e86641bccdf-ceph\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.405709 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.406124 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77d85923-9fcd-437f-b584-6e86641bccdf-logs\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.409775 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77d85923-9fcd-437f-b584-6e86641bccdf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.412045 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77d85923-9fcd-437f-b584-6e86641bccdf-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.412172 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/77d85923-9fcd-437f-b584-6e86641bccdf-ceph\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.412818 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d85923-9fcd-437f-b584-6e86641bccdf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.415560 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77d85923-9fcd-437f-b584-6e86641bccdf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.418123 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77d85923-9fcd-437f-b584-6e86641bccdf-config-data\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.424019 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-496dg\" (UniqueName: \"kubernetes.io/projected/77d85923-9fcd-437f-b584-6e86641bccdf-kube-api-access-496dg\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 
04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.447636 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"77d85923-9fcd-437f-b584-6e86641bccdf\") " pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.473062 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07092d70-d0e2-4247-a1e6-9744f5f3b7f9" path="/var/lib/kubelet/pods/07092d70-d0e2-4247-a1e6-9744f5f3b7f9/volumes" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.475482 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50125512-ecde-4687-8bd7-1365e307e3f7" path="/var/lib/kubelet/pods/50125512-ecde-4687-8bd7-1365e307e3f7/volumes" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.503498 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 10:34:01 crc kubenswrapper[4776]: I1204 10:34:01.776201 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 10:34:02 crc kubenswrapper[4776]: I1204 10:34:02.129153 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290","Type":"ContainerStarted","Data":"99c1596d3fcafd2890d491abccc8fe0946ae202143fd3ed7f22607a7b86534f9"} Dec 04 10:34:02 crc kubenswrapper[4776]: I1204 10:34:02.192182 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 10:34:03 crc kubenswrapper[4776]: I1204 10:34:03.167639 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"77d85923-9fcd-437f-b584-6e86641bccdf","Type":"ContainerStarted","Data":"e54d99202ca6f077a385fc1363fe0acd4ef50ea5292b88f48a5394bc3176ee57"} Dec 04 10:34:03 crc kubenswrapper[4776]: I1204 10:34:03.914571 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 04 10:34:03 crc kubenswrapper[4776]: I1204 10:34:03.928663 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.181293 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"77d85923-9fcd-437f-b584-6e86641bccdf","Type":"ContainerStarted","Data":"674dc1dfa2c3c60bbbeedb4b1840d7b09098cd21bf7a18ad9e95e3f73d4d6a19"} Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.182862 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290","Type":"ContainerStarted","Data":"8e91d8e6175d262f5b5e8f4695da08d8e45029d658ec5ff5f54d6225d72434f8"} Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.522864 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-svl4r"] Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.524312 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.532641 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-sn44s" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.532877 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.538747 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-svl4r"] Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.624703 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxg4s\" (UniqueName: \"kubernetes.io/projected/7479329d-b468-4068-b9a7-3bf148b4a299-kube-api-access-vxg4s\") pod \"manila-db-sync-svl4r\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.624764 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-job-config-data\") pod \"manila-db-sync-svl4r\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.625044 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-combined-ca-bundle\") pod \"manila-db-sync-svl4r\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.625183 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-config-data\") pod \"manila-db-sync-svl4r\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.727019 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-config-data\") pod \"manila-db-sync-svl4r\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.727172 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxg4s\" (UniqueName: \"kubernetes.io/projected/7479329d-b468-4068-b9a7-3bf148b4a299-kube-api-access-vxg4s\") pod \"manila-db-sync-svl4r\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.727213 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-job-config-data\") pod \"manila-db-sync-svl4r\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.727336 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-combined-ca-bundle\") pod \"manila-db-sync-svl4r\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.735097 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-config-data\") pod \"manila-db-sync-svl4r\" (UID: 
\"7479329d-b468-4068-b9a7-3bf148b4a299\") " pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.735128 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-combined-ca-bundle\") pod \"manila-db-sync-svl4r\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.746888 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxg4s\" (UniqueName: \"kubernetes.io/projected/7479329d-b468-4068-b9a7-3bf148b4a299-kube-api-access-vxg4s\") pod \"manila-db-sync-svl4r\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.760482 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-job-config-data\") pod \"manila-db-sync-svl4r\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:04 crc kubenswrapper[4776]: I1204 10:34:04.850075 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:08 crc kubenswrapper[4776]: I1204 10:34:08.612707 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-svl4r"] Dec 04 10:34:08 crc kubenswrapper[4776]: W1204 10:34:08.627800 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7479329d_b468_4068_b9a7_3bf148b4a299.slice/crio-898ccb293165d58dc73a675ab980ff18c288ffc4dc8e627bc364efb48b0b770d WatchSource:0}: Error finding container 898ccb293165d58dc73a675ab980ff18c288ffc4dc8e627bc364efb48b0b770d: Status 404 returned error can't find the container with id 898ccb293165d58dc73a675ab980ff18c288ffc4dc8e627bc364efb48b0b770d Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.286472 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fb87d5bd-kzccs" event={"ID":"1b6d19b7-632e-4f82-8311-13e154f240f5","Type":"ContainerStarted","Data":"d9c7f5540f9aba3ffd4650dc15da0647bfc66e2316977aa7e31e6dbeb2d7be18"} Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.287263 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fb87d5bd-kzccs" event={"ID":"1b6d19b7-632e-4f82-8311-13e154f240f5","Type":"ContainerStarted","Data":"09929504e399f57f507f7990d612811343de9580614402c5c5af40102136b1d1"} Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.329109 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d787f787d-lqf8p" event={"ID":"1ec22398-eab3-46af-8843-1c71a2f5db12","Type":"ContainerStarted","Data":"c9d4e02b3ee2dc4e90a2f4bef574285bf97303621ba15ef87e12f4f4d5740303"} Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.329157 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d787f787d-lqf8p" 
event={"ID":"1ec22398-eab3-46af-8843-1c71a2f5db12","Type":"ContainerStarted","Data":"7e1dfe7824beb27adedf5a1ae920712239af99a4388006dcb5f781b9285bba26"} Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.337760 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-85fb87d5bd-kzccs" podStartSLOduration=3.69404425 podStartE2EDuration="13.337728231s" podCreationTimestamp="2025-12-04 10:33:56 +0000 UTC" firstStartedPulling="2025-12-04 10:33:58.56997284 +0000 UTC m=+3283.436453207" lastFinishedPulling="2025-12-04 10:34:08.213656801 +0000 UTC m=+3293.080137188" observedRunningTime="2025-12-04 10:34:09.336311567 +0000 UTC m=+3294.202791944" watchObservedRunningTime="2025-12-04 10:34:09.337728231 +0000 UTC m=+3294.204208608" Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.338840 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8474fbc5b9-ntlst" event={"ID":"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8","Type":"ContainerStarted","Data":"6fcf399226a4fc4ec52cfdefa1afcece04229235449e8bc74536213d5e17186f"} Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.338897 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8474fbc5b9-ntlst" event={"ID":"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8","Type":"ContainerStarted","Data":"8d9c3caa3c5c107fd5342c9cf66fef4fa9f31b2fa76058e645052b20dc001970"} Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.339012 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8474fbc5b9-ntlst" podUID="694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" containerName="horizon" containerID="cri-o://6fcf399226a4fc4ec52cfdefa1afcece04229235449e8bc74536213d5e17186f" gracePeriod=30 Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.339003 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8474fbc5b9-ntlst" podUID="694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" containerName="horizon-log" 
containerID="cri-o://8d9c3caa3c5c107fd5342c9cf66fef4fa9f31b2fa76058e645052b20dc001970" gracePeriod=30 Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.368363 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"77d85923-9fcd-437f-b584-6e86641bccdf","Type":"ContainerStarted","Data":"7f0489b4dc2ac74d4ba9150cbe9e202e543d4a80e6e5f153d5f3cf5f9eac701f"} Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.381283 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7cdcbd14-b300-4a1d-b3c5-0cf70e20b290","Type":"ContainerStarted","Data":"d83d03eef56d3f9c04ff87a2318baf3889312953782c43c1ec7caf349750602b"} Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.384028 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-svl4r" event={"ID":"7479329d-b468-4068-b9a7-3bf148b4a299","Type":"ContainerStarted","Data":"898ccb293165d58dc73a675ab980ff18c288ffc4dc8e627bc364efb48b0b770d"} Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.387708 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7847c6bcd5-qq274" event={"ID":"0576f02c-3bb8-4a18-a9b8-464e2bf22947","Type":"ContainerStarted","Data":"5753c1c176c7d6f3d943980b5f5fa32a9c7a77fcc6ce0db4988d1aca2dc06d72"} Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.387752 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7847c6bcd5-qq274" event={"ID":"0576f02c-3bb8-4a18-a9b8-464e2bf22947","Type":"ContainerStarted","Data":"1c00d342d813650c9915c89faa42b262c2180b4711f1485ba2753ce19281843d"} Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.387925 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7847c6bcd5-qq274" podUID="0576f02c-3bb8-4a18-a9b8-464e2bf22947" containerName="horizon-log" 
containerID="cri-o://1c00d342d813650c9915c89faa42b262c2180b4711f1485ba2753ce19281843d" gracePeriod=30 Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.388534 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7847c6bcd5-qq274" podUID="0576f02c-3bb8-4a18-a9b8-464e2bf22947" containerName="horizon" containerID="cri-o://5753c1c176c7d6f3d943980b5f5fa32a9c7a77fcc6ce0db4988d1aca2dc06d72" gracePeriod=30 Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.389987 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d787f787d-lqf8p" podStartSLOduration=2.561072137 podStartE2EDuration="12.389963325s" podCreationTimestamp="2025-12-04 10:33:57 +0000 UTC" firstStartedPulling="2025-12-04 10:33:58.487399252 +0000 UTC m=+3283.353879629" lastFinishedPulling="2025-12-04 10:34:08.31629044 +0000 UTC m=+3293.182770817" observedRunningTime="2025-12-04 10:34:09.369731048 +0000 UTC m=+3294.236211625" watchObservedRunningTime="2025-12-04 10:34:09.389963325 +0000 UTC m=+3294.256443702" Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.411291 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8474fbc5b9-ntlst" podStartSLOduration=2.470320241 podStartE2EDuration="15.411255485s" podCreationTimestamp="2025-12-04 10:33:54 +0000 UTC" firstStartedPulling="2025-12-04 10:33:55.402278804 +0000 UTC m=+3280.268759181" lastFinishedPulling="2025-12-04 10:34:08.343214048 +0000 UTC m=+3293.209694425" observedRunningTime="2025-12-04 10:34:09.397514482 +0000 UTC m=+3294.263994859" watchObservedRunningTime="2025-12-04 10:34:09.411255485 +0000 UTC m=+3294.277735872" Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.432096 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.43206588 podStartE2EDuration="8.43206588s" podCreationTimestamp="2025-12-04 10:34:01 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:09.417831022 +0000 UTC m=+3294.284311399" watchObservedRunningTime="2025-12-04 10:34:09.43206588 +0000 UTC m=+3294.298546257" Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.455744 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7847c6bcd5-qq274" podStartSLOduration=2.658690809 podStartE2EDuration="15.455723304s" podCreationTimestamp="2025-12-04 10:33:54 +0000 UTC" firstStartedPulling="2025-12-04 10:33:55.451287937 +0000 UTC m=+3280.317768324" lastFinishedPulling="2025-12-04 10:34:08.248320442 +0000 UTC m=+3293.114800819" observedRunningTime="2025-12-04 10:34:09.438496102 +0000 UTC m=+3294.304976479" watchObservedRunningTime="2025-12-04 10:34:09.455723304 +0000 UTC m=+3294.322203671" Dec 04 10:34:09 crc kubenswrapper[4776]: I1204 10:34:09.466703 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.466688139 podStartE2EDuration="9.466688139s" podCreationTimestamp="2025-12-04 10:34:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:09.461510166 +0000 UTC m=+3294.327990553" watchObservedRunningTime="2025-12-04 10:34:09.466688139 +0000 UTC m=+3294.333168516" Dec 04 10:34:10 crc kubenswrapper[4776]: I1204 10:34:10.635029 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 10:34:10 crc kubenswrapper[4776]: I1204 10:34:10.635706 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 10:34:10 crc kubenswrapper[4776]: I1204 10:34:10.700018 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Dec 04 10:34:10 crc kubenswrapper[4776]: I1204 10:34:10.702519 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 10:34:11 crc kubenswrapper[4776]: I1204 10:34:11.423025 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 10:34:11 crc kubenswrapper[4776]: I1204 10:34:11.423586 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 10:34:11 crc kubenswrapper[4776]: I1204 10:34:11.504108 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 10:34:11 crc kubenswrapper[4776]: I1204 10:34:11.504317 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 10:34:11 crc kubenswrapper[4776]: I1204 10:34:11.590629 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 10:34:11 crc kubenswrapper[4776]: I1204 10:34:11.623825 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 10:34:12 crc kubenswrapper[4776]: I1204 10:34:12.433141 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 10:34:12 crc kubenswrapper[4776]: I1204 10:34:12.433189 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 10:34:13 crc kubenswrapper[4776]: I1204 10:34:13.443430 4776 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 10:34:14 crc kubenswrapper[4776]: I1204 10:34:14.468345 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Dec 04 10:34:14 crc kubenswrapper[4776]: I1204 10:34:14.567820 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 10:34:14 crc kubenswrapper[4776]: I1204 10:34:14.570725 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 10:34:14 crc kubenswrapper[4776]: I1204 10:34:14.604292 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:34:14 crc kubenswrapper[4776]: I1204 10:34:14.705916 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:34:16 crc kubenswrapper[4776]: I1204 10:34:16.488552 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-svl4r" event={"ID":"7479329d-b468-4068-b9a7-3bf148b4a299","Type":"ContainerStarted","Data":"f579a5e78de540f59a32c794abc8fe627d2f20223870d2dfa6a19d403ad13c26"} Dec 04 10:34:16 crc kubenswrapper[4776]: I1204 10:34:16.512714 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-svl4r" podStartSLOduration=7.139788176 podStartE2EDuration="12.512699021s" podCreationTimestamp="2025-12-04 10:34:04 +0000 UTC" firstStartedPulling="2025-12-04 10:34:08.631755617 +0000 UTC m=+3293.498235994" lastFinishedPulling="2025-12-04 10:34:14.004666472 +0000 UTC m=+3298.871146839" observedRunningTime="2025-12-04 10:34:16.511737431 +0000 UTC m=+3301.378217798" watchObservedRunningTime="2025-12-04 10:34:16.512699021 +0000 UTC m=+3301.379179398" Dec 04 10:34:17 crc kubenswrapper[4776]: I1204 10:34:17.586119 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:34:17 crc kubenswrapper[4776]: I1204 10:34:17.586437 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:34:17 crc kubenswrapper[4776]: I1204 10:34:17.705282 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:34:17 crc kubenswrapper[4776]: I1204 10:34:17.705342 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:34:18 crc kubenswrapper[4776]: I1204 10:34:18.049530 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 10:34:27 crc kubenswrapper[4776]: I1204 10:34:27.587818 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6d787f787d-lqf8p" podUID="1ec22398-eab3-46af-8843-1c71a2f5db12" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Dec 04 10:34:27 crc kubenswrapper[4776]: I1204 10:34:27.707488 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-85fb87d5bd-kzccs" podUID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused" Dec 04 10:34:28 crc kubenswrapper[4776]: I1204 10:34:28.678235 4776 generic.go:334] "Generic (PLEG): container finished" podID="7479329d-b468-4068-b9a7-3bf148b4a299" containerID="f579a5e78de540f59a32c794abc8fe627d2f20223870d2dfa6a19d403ad13c26" exitCode=0 Dec 04 10:34:28 crc kubenswrapper[4776]: I1204 10:34:28.678282 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-svl4r" event={"ID":"7479329d-b468-4068-b9a7-3bf148b4a299","Type":"ContainerDied","Data":"f579a5e78de540f59a32c794abc8fe627d2f20223870d2dfa6a19d403ad13c26"} Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.184502 4776 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.286212 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxg4s\" (UniqueName: \"kubernetes.io/projected/7479329d-b468-4068-b9a7-3bf148b4a299-kube-api-access-vxg4s\") pod \"7479329d-b468-4068-b9a7-3bf148b4a299\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.286546 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-job-config-data\") pod \"7479329d-b468-4068-b9a7-3bf148b4a299\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.286580 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-combined-ca-bundle\") pod \"7479329d-b468-4068-b9a7-3bf148b4a299\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.286608 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-config-data\") pod \"7479329d-b468-4068-b9a7-3bf148b4a299\" (UID: \"7479329d-b468-4068-b9a7-3bf148b4a299\") " Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.292406 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "7479329d-b468-4068-b9a7-3bf148b4a299" (UID: "7479329d-b468-4068-b9a7-3bf148b4a299"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.295492 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-config-data" (OuterVolumeSpecName: "config-data") pod "7479329d-b468-4068-b9a7-3bf148b4a299" (UID: "7479329d-b468-4068-b9a7-3bf148b4a299"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.309077 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7479329d-b468-4068-b9a7-3bf148b4a299-kube-api-access-vxg4s" (OuterVolumeSpecName: "kube-api-access-vxg4s") pod "7479329d-b468-4068-b9a7-3bf148b4a299" (UID: "7479329d-b468-4068-b9a7-3bf148b4a299"). InnerVolumeSpecName "kube-api-access-vxg4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.316749 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7479329d-b468-4068-b9a7-3bf148b4a299" (UID: "7479329d-b468-4068-b9a7-3bf148b4a299"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.390869 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxg4s\" (UniqueName: \"kubernetes.io/projected/7479329d-b468-4068-b9a7-3bf148b4a299-kube-api-access-vxg4s\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.390944 4776 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-job-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.390957 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.390968 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7479329d-b468-4068-b9a7-3bf148b4a299-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.699371 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-svl4r" event={"ID":"7479329d-b468-4068-b9a7-3bf148b4a299","Type":"ContainerDied","Data":"898ccb293165d58dc73a675ab980ff18c288ffc4dc8e627bc364efb48b0b770d"} Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.699411 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="898ccb293165d58dc73a675ab980ff18c288ffc4dc8e627bc364efb48b0b770d" Dec 04 10:34:30 crc kubenswrapper[4776]: I1204 10:34:30.699464 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-svl4r" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.159197 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 10:34:31 crc kubenswrapper[4776]: E1204 10:34:31.163145 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7479329d-b468-4068-b9a7-3bf148b4a299" containerName="manila-db-sync" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.163172 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7479329d-b468-4068-b9a7-3bf148b4a299" containerName="manila-db-sync" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.163390 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7479329d-b468-4068-b9a7-3bf148b4a299" containerName="manila-db-sync" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.164474 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.172180 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.172567 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.172908 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.173652 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-sn44s" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.187967 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.219453 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.220173 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-scripts\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.220351 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsr94\" (UniqueName: \"kubernetes.io/projected/e5e262e7-ee7b-40f2-aa77-816faf806d8f-kube-api-access-qsr94\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.220517 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.220635 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5e262e7-ee7b-40f2-aa77-816faf806d8f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.220764 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-config-data\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.248248 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.251665 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.257308 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.266577 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.303050 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-slhwr"] Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.308211 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.322530 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6a6029b7-6021-4e90-b568-d45da6047d62-ceph\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.322589 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cfr2\" (UniqueName: \"kubernetes.io/projected/6a6029b7-6021-4e90-b568-d45da6047d62-kube-api-access-2cfr2\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.322617 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsr94\" (UniqueName: \"kubernetes.io/projected/e5e262e7-ee7b-40f2-aa77-816faf806d8f-kube-api-access-qsr94\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.322637 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.322904 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " 
pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.323065 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5e262e7-ee7b-40f2-aa77-816faf806d8f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.323108 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a6029b7-6021-4e90-b568-d45da6047d62-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.323168 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6a6029b7-6021-4e90-b568-d45da6047d62-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.323281 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-config-data\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.323321 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-scripts\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.323566 4776 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.323750 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-config-data\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.323907 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.323983 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-scripts\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.326336 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5e262e7-ee7b-40f2-aa77-816faf806d8f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.331186 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.331979 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-scripts\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.345267 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-slhwr"] Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.353869 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-config-data\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.369529 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsr94\" (UniqueName: \"kubernetes.io/projected/e5e262e7-ee7b-40f2-aa77-816faf806d8f-kube-api-access-qsr94\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.372889 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") " pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429495 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/6a6029b7-6021-4e90-b568-d45da6047d62-ceph\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429554 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cfr2\" (UniqueName: \"kubernetes.io/projected/6a6029b7-6021-4e90-b568-d45da6047d62-kube-api-access-2cfr2\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429575 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429633 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429661 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-config\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429684 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/6a6029b7-6021-4e90-b568-d45da6047d62-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429704 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6a6029b7-6021-4e90-b568-d45da6047d62-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429726 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw4lf\" (UniqueName: \"kubernetes.io/projected/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-kube-api-access-cw4lf\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429746 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-scripts\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429779 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429812 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429845 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-config-data\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429864 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.429907 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.432416 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6a6029b7-6021-4e90-b568-d45da6047d62-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.434581 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/6a6029b7-6021-4e90-b568-d45da6047d62-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.437792 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-scripts\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.439325 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-config-data\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.444186 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6a6029b7-6021-4e90-b568-d45da6047d62-ceph\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.446482 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.447189 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" 
Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.471155 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cfr2\" (UniqueName: \"kubernetes.io/projected/6a6029b7-6021-4e90-b568-d45da6047d62-kube-api-access-2cfr2\") pod \"manila-share-share1-0\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.499336 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.539299 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.539654 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.539893 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.539971 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-config\") pod 
\"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.540061 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw4lf\" (UniqueName: \"kubernetes.io/projected/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-kube-api-access-cw4lf\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.567690 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.569797 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.570480 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.573976 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " 
pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.577687 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-config\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.589226 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.608549 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw4lf\" (UniqueName: \"kubernetes.io/projected/e6596bf3-fdc9-4ccf-b81a-3e5372bef33f-kube-api-access-cw4lf\") pod \"dnsmasq-dns-76b5fdb995-slhwr\" (UID: \"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f\") " pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.616247 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.622041 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.631508 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.656049 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.658656 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.783250 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjnr9\" (UniqueName: \"kubernetes.io/projected/2291aeda-955e-4c36-b28a-2a6697748dd2-kube-api-access-fjnr9\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.783726 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-scripts\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.783838 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2291aeda-955e-4c36-b28a-2a6697748dd2-etc-machine-id\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.783868 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-config-data-custom\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.783902 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-combined-ca-bundle\") 
pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.784016 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-config-data\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.784079 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2291aeda-955e-4c36-b28a-2a6697748dd2-logs\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.868765 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.886177 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-scripts\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.886281 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2291aeda-955e-4c36-b28a-2a6697748dd2-etc-machine-id\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.886314 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-config-data-custom\") pod \"manila-api-0\" (UID: 
\"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.888583 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.888753 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-config-data\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.888843 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2291aeda-955e-4c36-b28a-2a6697748dd2-logs\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.888994 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjnr9\" (UniqueName: \"kubernetes.io/projected/2291aeda-955e-4c36-b28a-2a6697748dd2-kube-api-access-fjnr9\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.891995 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2291aeda-955e-4c36-b28a-2a6697748dd2-etc-machine-id\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.895643 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2291aeda-955e-4c36-b28a-2a6697748dd2-logs\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.900692 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-scripts\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.901543 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.903891 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-config-data\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.905704 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-config-data-custom\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.916665 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjnr9\" (UniqueName: \"kubernetes.io/projected/2291aeda-955e-4c36-b28a-2a6697748dd2-kube-api-access-fjnr9\") pod \"manila-api-0\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " pod="openstack/manila-api-0" Dec 04 10:34:31 crc kubenswrapper[4776]: I1204 10:34:31.963509 4776 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 04 10:34:32 crc kubenswrapper[4776]: I1204 10:34:32.158578 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 10:34:32 crc kubenswrapper[4776]: I1204 10:34:32.452298 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:34:32 crc kubenswrapper[4776]: I1204 10:34:32.572235 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-slhwr"] Dec 04 10:34:32 crc kubenswrapper[4776]: I1204 10:34:32.790134 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e5e262e7-ee7b-40f2-aa77-816faf806d8f","Type":"ContainerStarted","Data":"b5a4ac6705f21bec0f2a4e8bb9f9c94fce8dfa7d4e619721689d849b6986d8a3"} Dec 04 10:34:32 crc kubenswrapper[4776]: I1204 10:34:32.804175 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" event={"ID":"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f","Type":"ContainerStarted","Data":"437868c72c8c955459c7346d3ef6b4032add7a0722241297cfdfbef651f0ba7b"} Dec 04 10:34:32 crc kubenswrapper[4776]: I1204 10:34:32.819064 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6a6029b7-6021-4e90-b568-d45da6047d62","Type":"ContainerStarted","Data":"179abc4f92d9843d4b8e63346ec969a18ab1886aeb14f58f34fc28f175b8c3dd"} Dec 04 10:34:32 crc kubenswrapper[4776]: I1204 10:34:32.907230 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:34:33 crc kubenswrapper[4776]: I1204 10:34:33.874288 4776 generic.go:334] "Generic (PLEG): container finished" podID="e6596bf3-fdc9-4ccf-b81a-3e5372bef33f" containerID="5f2b182bab457d42f8bf56fa95f25e5853c8a4c3a4f47425ed748470f7598d4a" exitCode=0 Dec 04 10:34:33 crc kubenswrapper[4776]: I1204 10:34:33.875008 4776 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" event={"ID":"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f","Type":"ContainerDied","Data":"5f2b182bab457d42f8bf56fa95f25e5853c8a4c3a4f47425ed748470f7598d4a"} Dec 04 10:34:33 crc kubenswrapper[4776]: I1204 10:34:33.899626 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"2291aeda-955e-4c36-b28a-2a6697748dd2","Type":"ContainerStarted","Data":"0bc7c4a509d4d508ae784441e4cd3812f3055cc8951a05058dbd85d63f7bf0be"} Dec 04 10:34:34 crc kubenswrapper[4776]: I1204 10:34:34.936531 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:34:34 crc kubenswrapper[4776]: I1204 10:34:34.948495 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e5e262e7-ee7b-40f2-aa77-816faf806d8f","Type":"ContainerStarted","Data":"7bb0d921044d17920e62e47e0dedc4d0d602ad74d69c6d3a1fa41967a649bf6f"} Dec 04 10:34:34 crc kubenswrapper[4776]: I1204 10:34:34.948561 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e5e262e7-ee7b-40f2-aa77-816faf806d8f","Type":"ContainerStarted","Data":"89b932b71a7a358814c204810eaeb1ba27be74502fec1b3793f36341435fcbe1"} Dec 04 10:34:34 crc kubenswrapper[4776]: I1204 10:34:34.957690 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" event={"ID":"e6596bf3-fdc9-4ccf-b81a-3e5372bef33f","Type":"ContainerStarted","Data":"557d7f3130b360d44c2e471578b7d4f2b7f2f985df2980d5f105f03c0b91738b"} Dec 04 10:34:34 crc kubenswrapper[4776]: I1204 10:34:34.957814 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:34 crc kubenswrapper[4776]: I1204 10:34:34.962420 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"2291aeda-955e-4c36-b28a-2a6697748dd2","Type":"ContainerStarted","Data":"184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a"} Dec 04 10:34:34 crc kubenswrapper[4776]: I1204 10:34:34.962466 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"2291aeda-955e-4c36-b28a-2a6697748dd2","Type":"ContainerStarted","Data":"f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15"} Dec 04 10:34:34 crc kubenswrapper[4776]: I1204 10:34:34.962579 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 04 10:34:34 crc kubenswrapper[4776]: I1204 10:34:34.969631 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.16807105 podStartE2EDuration="3.969609071s" podCreationTimestamp="2025-12-04 10:34:31 +0000 UTC" firstStartedPulling="2025-12-04 10:34:32.219391782 +0000 UTC m=+3317.085872149" lastFinishedPulling="2025-12-04 10:34:33.020929793 +0000 UTC m=+3317.887410170" observedRunningTime="2025-12-04 10:34:34.967189705 +0000 UTC m=+3319.833670082" watchObservedRunningTime="2025-12-04 10:34:34.969609071 +0000 UTC m=+3319.836089448" Dec 04 10:34:35 crc kubenswrapper[4776]: I1204 10:34:35.002910 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" podStartSLOduration=4.002886248 podStartE2EDuration="4.002886248s" podCreationTimestamp="2025-12-04 10:34:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:34.994757932 +0000 UTC m=+3319.861238309" watchObservedRunningTime="2025-12-04 10:34:35.002886248 +0000 UTC m=+3319.869366625" Dec 04 10:34:35 crc kubenswrapper[4776]: I1204 10:34:35.025678 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.025655325 
podStartE2EDuration="4.025655325s" podCreationTimestamp="2025-12-04 10:34:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:34:35.010635822 +0000 UTC m=+3319.877116199" watchObservedRunningTime="2025-12-04 10:34:35.025655325 +0000 UTC m=+3319.892135702" Dec 04 10:34:35 crc kubenswrapper[4776]: I1204 10:34:35.980412 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="2291aeda-955e-4c36-b28a-2a6697748dd2" containerName="manila-api-log" containerID="cri-o://f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15" gracePeriod=30 Dec 04 10:34:35 crc kubenswrapper[4776]: I1204 10:34:35.980461 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="2291aeda-955e-4c36-b28a-2a6697748dd2" containerName="manila-api" containerID="cri-o://184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a" gracePeriod=30 Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.645410 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.738734 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-config-data-custom\") pod \"2291aeda-955e-4c36-b28a-2a6697748dd2\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.738799 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-config-data\") pod \"2291aeda-955e-4c36-b28a-2a6697748dd2\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.738979 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-scripts\") pod \"2291aeda-955e-4c36-b28a-2a6697748dd2\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.739028 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjnr9\" (UniqueName: \"kubernetes.io/projected/2291aeda-955e-4c36-b28a-2a6697748dd2-kube-api-access-fjnr9\") pod \"2291aeda-955e-4c36-b28a-2a6697748dd2\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.739113 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-combined-ca-bundle\") pod \"2291aeda-955e-4c36-b28a-2a6697748dd2\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.739137 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2291aeda-955e-4c36-b28a-2a6697748dd2-logs\") pod \"2291aeda-955e-4c36-b28a-2a6697748dd2\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.739266 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2291aeda-955e-4c36-b28a-2a6697748dd2-etc-machine-id\") pod \"2291aeda-955e-4c36-b28a-2a6697748dd2\" (UID: \"2291aeda-955e-4c36-b28a-2a6697748dd2\") " Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.739839 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2291aeda-955e-4c36-b28a-2a6697748dd2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2291aeda-955e-4c36-b28a-2a6697748dd2" (UID: "2291aeda-955e-4c36-b28a-2a6697748dd2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.740531 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2291aeda-955e-4c36-b28a-2a6697748dd2-logs" (OuterVolumeSpecName: "logs") pod "2291aeda-955e-4c36-b28a-2a6697748dd2" (UID: "2291aeda-955e-4c36-b28a-2a6697748dd2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.747975 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-scripts" (OuterVolumeSpecName: "scripts") pod "2291aeda-955e-4c36-b28a-2a6697748dd2" (UID: "2291aeda-955e-4c36-b28a-2a6697748dd2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.751290 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2291aeda-955e-4c36-b28a-2a6697748dd2-kube-api-access-fjnr9" (OuterVolumeSpecName: "kube-api-access-fjnr9") pod "2291aeda-955e-4c36-b28a-2a6697748dd2" (UID: "2291aeda-955e-4c36-b28a-2a6697748dd2"). InnerVolumeSpecName "kube-api-access-fjnr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.772782 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2291aeda-955e-4c36-b28a-2a6697748dd2" (UID: "2291aeda-955e-4c36-b28a-2a6697748dd2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.793466 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2291aeda-955e-4c36-b28a-2a6697748dd2" (UID: "2291aeda-955e-4c36-b28a-2a6697748dd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.811139 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-config-data" (OuterVolumeSpecName: "config-data") pod "2291aeda-955e-4c36-b28a-2a6697748dd2" (UID: "2291aeda-955e-4c36-b28a-2a6697748dd2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.841431 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.841467 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2291aeda-955e-4c36-b28a-2a6697748dd2-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.841479 4776 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2291aeda-955e-4c36-b28a-2a6697748dd2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.841493 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.841505 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.841531 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2291aeda-955e-4c36-b28a-2a6697748dd2-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.841543 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjnr9\" (UniqueName: \"kubernetes.io/projected/2291aeda-955e-4c36-b28a-2a6697748dd2-kube-api-access-fjnr9\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.990896 4776 generic.go:334] "Generic 
(PLEG): container finished" podID="2291aeda-955e-4c36-b28a-2a6697748dd2" containerID="184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a" exitCode=0 Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.990950 4776 generic.go:334] "Generic (PLEG): container finished" podID="2291aeda-955e-4c36-b28a-2a6697748dd2" containerID="f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15" exitCode=143 Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.991005 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.991836 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"2291aeda-955e-4c36-b28a-2a6697748dd2","Type":"ContainerDied","Data":"184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a"} Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.991992 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"2291aeda-955e-4c36-b28a-2a6697748dd2","Type":"ContainerDied","Data":"f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15"} Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.992047 4776 scope.go:117] "RemoveContainer" containerID="184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a" Dec 04 10:34:36 crc kubenswrapper[4776]: I1204 10:34:36.992064 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"2291aeda-955e-4c36-b28a-2a6697748dd2","Type":"ContainerDied","Data":"0bc7c4a509d4d508ae784441e4cd3812f3055cc8951a05058dbd85d63f7bf0be"} Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.019119 4776 scope.go:117] "RemoveContainer" containerID="f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.041965 4776 scope.go:117] "RemoveContainer" 
containerID="184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a" Dec 04 10:34:37 crc kubenswrapper[4776]: E1204 10:34:37.042946 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a\": container with ID starting with 184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a not found: ID does not exist" containerID="184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.043043 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a"} err="failed to get container status \"184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a\": rpc error: code = NotFound desc = could not find container \"184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a\": container with ID starting with 184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a not found: ID does not exist" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.043096 4776 scope.go:117] "RemoveContainer" containerID="f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15" Dec 04 10:34:37 crc kubenswrapper[4776]: E1204 10:34:37.043669 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15\": container with ID starting with f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15 not found: ID does not exist" containerID="f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.043736 4776 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15"} err="failed to get container status \"f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15\": rpc error: code = NotFound desc = could not find container \"f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15\": container with ID starting with f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15 not found: ID does not exist" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.043782 4776 scope.go:117] "RemoveContainer" containerID="184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.044307 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a"} err="failed to get container status \"184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a\": rpc error: code = NotFound desc = could not find container \"184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a\": container with ID starting with 184e8f424e204f3eeb664a9bf027d8a1a4540fa9ad93fec5e7f16f8cd3ad3d2a not found: ID does not exist" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.044360 4776 scope.go:117] "RemoveContainer" containerID="f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.044691 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15"} err="failed to get container status \"f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15\": rpc error: code = NotFound desc = could not find container \"f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15\": container with ID starting with f70d29ae9673fb6d44ca497b877c76906b581ac739c1933c5133a5c2a61b3f15 not found: ID does not 
exist" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.044852 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.057383 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.092393 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 04 10:34:37 crc kubenswrapper[4776]: E1204 10:34:37.092938 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2291aeda-955e-4c36-b28a-2a6697748dd2" containerName="manila-api-log" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.092957 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2291aeda-955e-4c36-b28a-2a6697748dd2" containerName="manila-api-log" Dec 04 10:34:37 crc kubenswrapper[4776]: E1204 10:34:37.092972 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2291aeda-955e-4c36-b28a-2a6697748dd2" containerName="manila-api" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.092979 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2291aeda-955e-4c36-b28a-2a6697748dd2" containerName="manila-api" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.093196 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2291aeda-955e-4c36-b28a-2a6697748dd2" containerName="manila-api" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.093222 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2291aeda-955e-4c36-b28a-2a6697748dd2" containerName="manila-api-log" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.094254 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.100977 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.101207 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.101344 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.104060 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.255160 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.255209 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-public-tls-certs\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.255327 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-scripts\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.255472 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ngjds\" (UniqueName: \"kubernetes.io/projected/aa8ba719-bab7-4330-97b3-1e1e35d20784-kube-api-access-ngjds\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.255719 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-config-data-custom\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.255832 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-config-data\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.255975 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-internal-tls-certs\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.256019 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa8ba719-bab7-4330-97b3-1e1e35d20784-etc-machine-id\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.256290 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa8ba719-bab7-4330-97b3-1e1e35d20784-logs\") pod 
\"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.357872 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.357940 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-public-tls-certs\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.357971 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-scripts\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.358034 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngjds\" (UniqueName: \"kubernetes.io/projected/aa8ba719-bab7-4330-97b3-1e1e35d20784-kube-api-access-ngjds\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.358101 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-config-data-custom\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.358137 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-config-data\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.358181 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-internal-tls-certs\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.358206 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa8ba719-bab7-4330-97b3-1e1e35d20784-etc-machine-id\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.358291 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa8ba719-bab7-4330-97b3-1e1e35d20784-logs\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.358584 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa8ba719-bab7-4330-97b3-1e1e35d20784-etc-machine-id\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.358728 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa8ba719-bab7-4330-97b3-1e1e35d20784-logs\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 
10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.363701 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-config-data\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.363747 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-public-tls-certs\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.363813 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-internal-tls-certs\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.363948 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.364574 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-config-data-custom\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.373737 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa8ba719-bab7-4330-97b3-1e1e35d20784-scripts\") pod \"manila-api-0\" 
(UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.377995 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngjds\" (UniqueName: \"kubernetes.io/projected/aa8ba719-bab7-4330-97b3-1e1e35d20784-kube-api-access-ngjds\") pod \"manila-api-0\" (UID: \"aa8ba719-bab7-4330-97b3-1e1e35d20784\") " pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.422199 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 04 10:34:37 crc kubenswrapper[4776]: I1204 10:34:37.466485 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2291aeda-955e-4c36-b28a-2a6697748dd2" path="/var/lib/kubelet/pods/2291aeda-955e-4c36-b28a-2a6697748dd2/volumes" Dec 04 10:34:38 crc kubenswrapper[4776]: I1204 10:34:38.041948 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 10:34:39 crc kubenswrapper[4776]: I1204 10:34:39.804899 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:34:39 crc kubenswrapper[4776]: I1204 10:34:39.817539 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:34:40 crc kubenswrapper[4776]: I1204 10:34:40.064780 4776 generic.go:334] "Generic (PLEG): container finished" podID="0576f02c-3bb8-4a18-a9b8-464e2bf22947" containerID="5753c1c176c7d6f3d943980b5f5fa32a9c7a77fcc6ce0db4988d1aca2dc06d72" exitCode=137 Dec 04 10:34:40 crc kubenswrapper[4776]: I1204 10:34:40.064810 4776 generic.go:334] "Generic (PLEG): container finished" podID="0576f02c-3bb8-4a18-a9b8-464e2bf22947" containerID="1c00d342d813650c9915c89faa42b262c2180b4711f1485ba2753ce19281843d" exitCode=137 Dec 04 10:34:40 crc kubenswrapper[4776]: I1204 10:34:40.064845 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-7847c6bcd5-qq274" event={"ID":"0576f02c-3bb8-4a18-a9b8-464e2bf22947","Type":"ContainerDied","Data":"5753c1c176c7d6f3d943980b5f5fa32a9c7a77fcc6ce0db4988d1aca2dc06d72"} Dec 04 10:34:40 crc kubenswrapper[4776]: I1204 10:34:40.064871 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7847c6bcd5-qq274" event={"ID":"0576f02c-3bb8-4a18-a9b8-464e2bf22947","Type":"ContainerDied","Data":"1c00d342d813650c9915c89faa42b262c2180b4711f1485ba2753ce19281843d"} Dec 04 10:34:40 crc kubenswrapper[4776]: I1204 10:34:40.066761 4776 generic.go:334] "Generic (PLEG): container finished" podID="694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" containerID="6fcf399226a4fc4ec52cfdefa1afcece04229235449e8bc74536213d5e17186f" exitCode=137 Dec 04 10:34:40 crc kubenswrapper[4776]: I1204 10:34:40.066781 4776 generic.go:334] "Generic (PLEG): container finished" podID="694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" containerID="8d9c3caa3c5c107fd5342c9cf66fef4fa9f31b2fa76058e645052b20dc001970" exitCode=137 Dec 04 10:34:40 crc kubenswrapper[4776]: I1204 10:34:40.066797 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8474fbc5b9-ntlst" event={"ID":"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8","Type":"ContainerDied","Data":"6fcf399226a4fc4ec52cfdefa1afcece04229235449e8bc74536213d5e17186f"} Dec 04 10:34:40 crc kubenswrapper[4776]: I1204 10:34:40.066812 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8474fbc5b9-ntlst" event={"ID":"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8","Type":"ContainerDied","Data":"8d9c3caa3c5c107fd5342c9cf66fef4fa9f31b2fa76058e645052b20dc001970"} Dec 04 10:34:41 crc kubenswrapper[4776]: W1204 10:34:41.095971 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa8ba719_bab7_4330_97b3_1e1e35d20784.slice/crio-76a504d3f6a714599ee4daaf6a8d902c1ca1e1467b49e032ef89de4ccd4d2064 WatchSource:0}: Error finding container 
76a504d3f6a714599ee4daaf6a8d902c1ca1e1467b49e032ef89de4ccd4d2064: Status 404 returned error can't find the container with id 76a504d3f6a714599ee4daaf6a8d902c1ca1e1467b49e032ef89de4ccd4d2064 Dec 04 10:34:41 crc kubenswrapper[4776]: I1204 10:34:41.500627 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 04 10:34:41 crc kubenswrapper[4776]: I1204 10:34:41.644126 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6d787f787d-lqf8p" Dec 04 10:34:41 crc kubenswrapper[4776]: I1204 10:34:41.668708 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.730846 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85fb87d5bd-kzccs"] Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.799327 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.843925 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0576f02c-3bb8-4a18-a9b8-464e2bf22947-scripts\") pod \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.844012 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znv65\" (UniqueName: \"kubernetes.io/projected/0576f02c-3bb8-4a18-a9b8-464e2bf22947-kube-api-access-znv65\") pod \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.844056 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/0576f02c-3bb8-4a18-a9b8-464e2bf22947-horizon-secret-key\") pod \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.844115 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0576f02c-3bb8-4a18-a9b8-464e2bf22947-config-data\") pod \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.844134 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0576f02c-3bb8-4a18-a9b8-464e2bf22947-logs\") pod \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\" (UID: \"0576f02c-3bb8-4a18-a9b8-464e2bf22947\") " Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.845070 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0576f02c-3bb8-4a18-a9b8-464e2bf22947-logs" (OuterVolumeSpecName: "logs") pod "0576f02c-3bb8-4a18-a9b8-464e2bf22947" (UID: "0576f02c-3bb8-4a18-a9b8-464e2bf22947"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.850984 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0576f02c-3bb8-4a18-a9b8-464e2bf22947-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0576f02c-3bb8-4a18-a9b8-464e2bf22947" (UID: "0576f02c-3bb8-4a18-a9b8-464e2bf22947"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.851234 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0576f02c-3bb8-4a18-a9b8-464e2bf22947-kube-api-access-znv65" (OuterVolumeSpecName: "kube-api-access-znv65") pod "0576f02c-3bb8-4a18-a9b8-464e2bf22947" (UID: "0576f02c-3bb8-4a18-a9b8-464e2bf22947"). InnerVolumeSpecName "kube-api-access-znv65". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.867553 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.871822 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b5fdb995-slhwr" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.918620 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0576f02c-3bb8-4a18-a9b8-464e2bf22947-config-data" (OuterVolumeSpecName: "config-data") pod "0576f02c-3bb8-4a18-a9b8-464e2bf22947" (UID: "0576f02c-3bb8-4a18-a9b8-464e2bf22947"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.927605 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0576f02c-3bb8-4a18-a9b8-464e2bf22947-scripts" (OuterVolumeSpecName: "scripts") pod "0576f02c-3bb8-4a18-a9b8-464e2bf22947" (UID: "0576f02c-3bb8-4a18-a9b8-464e2bf22947"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.950647 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0576f02c-3bb8-4a18-a9b8-464e2bf22947-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.950696 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znv65\" (UniqueName: \"kubernetes.io/projected/0576f02c-3bb8-4a18-a9b8-464e2bf22947-kube-api-access-znv65\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.950707 4776 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0576f02c-3bb8-4a18-a9b8-464e2bf22947-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.950717 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0576f02c-3bb8-4a18-a9b8-464e2bf22947-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.950728 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0576f02c-3bb8-4a18-a9b8-464e2bf22947-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.981409 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-7tsd6"] Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:41.981628 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" podUID="a52c3d7a-7923-433a-b4ee-3ab8242fe04c" containerName="dnsmasq-dns" containerID="cri-o://af12b809cba73a4c828c0679864e84da4147143aef6e15ea3257f59df45f08ff" gracePeriod=10 Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.052688 4776 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-config-data\") pod \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.052780 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhmbl\" (UniqueName: \"kubernetes.io/projected/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-kube-api-access-bhmbl\") pod \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.052818 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-horizon-secret-key\") pod \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.052852 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-scripts\") pod \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.052878 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-logs\") pod \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\" (UID: \"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8\") " Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.053282 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-logs" (OuterVolumeSpecName: "logs") pod "694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" (UID: "694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.053559 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-logs\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.063825 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-kube-api-access-bhmbl" (OuterVolumeSpecName: "kube-api-access-bhmbl") pod "694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" (UID: "694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8"). InnerVolumeSpecName "kube-api-access-bhmbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.081623 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" (UID: "694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.082109 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-config-data" (OuterVolumeSpecName: "config-data") pod "694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" (UID: "694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.102654 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-scripts" (OuterVolumeSpecName: "scripts") pod "694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" (UID: "694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.106565 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7847c6bcd5-qq274" event={"ID":"0576f02c-3bb8-4a18-a9b8-464e2bf22947","Type":"ContainerDied","Data":"c0443deb32095c08686092562d576abd5f8612325b8ced4330621eb0517e70dd"} Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.106696 4776 scope.go:117] "RemoveContainer" containerID="5753c1c176c7d6f3d943980b5f5fa32a9c7a77fcc6ce0db4988d1aca2dc06d72" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.106891 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7847c6bcd5-qq274" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.116204 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8474fbc5b9-ntlst" event={"ID":"694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8","Type":"ContainerDied","Data":"231c192b63c81509c18864e26f95fdac0b76b801b68d21a738eab54185d4ff70"} Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.116327 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8474fbc5b9-ntlst" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.137228 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"aa8ba719-bab7-4330-97b3-1e1e35d20784","Type":"ContainerStarted","Data":"bb35470999f4a3425f637f864a8c39d3b7e2e5599b561b8fd74394c93609c9a8"} Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.137293 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"aa8ba719-bab7-4330-97b3-1e1e35d20784","Type":"ContainerStarted","Data":"76a504d3f6a714599ee4daaf6a8d902c1ca1e1467b49e032ef89de4ccd4d2064"} Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.137320 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-85fb87d5bd-kzccs" podUID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerName="horizon-log" containerID="cri-o://09929504e399f57f507f7990d612811343de9580614402c5c5af40102136b1d1" gracePeriod=30 Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.137368 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-85fb87d5bd-kzccs" podUID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerName="horizon" containerID="cri-o://d9c7f5540f9aba3ffd4650dc15da0647bfc66e2316977aa7e31e6dbeb2d7be18" gracePeriod=30 Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.156478 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.156517 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhmbl\" (UniqueName: \"kubernetes.io/projected/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-kube-api-access-bhmbl\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.156532 4776 
reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.156545 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.339580 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7847c6bcd5-qq274"] Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.349126 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7847c6bcd5-qq274"] Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.361088 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8474fbc5b9-ntlst"] Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.372667 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8474fbc5b9-ntlst"] Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.545106 4776 scope.go:117] "RemoveContainer" containerID="1c00d342d813650c9915c89faa42b262c2180b4711f1485ba2753ce19281843d" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.776412 4776 scope.go:117] "RemoveContainer" containerID="6fcf399226a4fc4ec52cfdefa1afcece04229235449e8bc74536213d5e17186f" Dec 04 10:34:42 crc kubenswrapper[4776]: I1204 10:34:42.875604 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.079982 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-openstack-edpm-ipam\") pod \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.080435 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-config\") pod \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.080527 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-ovsdbserver-nb\") pod \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.081263 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-ovsdbserver-sb\") pod \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.081412 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-dns-svc\") pod \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.081560 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wm4b\" 
(UniqueName: \"kubernetes.io/projected/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-kube-api-access-7wm4b\") pod \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\" (UID: \"a52c3d7a-7923-433a-b4ee-3ab8242fe04c\") " Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.086173 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-kube-api-access-7wm4b" (OuterVolumeSpecName: "kube-api-access-7wm4b") pod "a52c3d7a-7923-433a-b4ee-3ab8242fe04c" (UID: "a52c3d7a-7923-433a-b4ee-3ab8242fe04c"). InnerVolumeSpecName "kube-api-access-7wm4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.148525 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-config" (OuterVolumeSpecName: "config") pod "a52c3d7a-7923-433a-b4ee-3ab8242fe04c" (UID: "a52c3d7a-7923-433a-b4ee-3ab8242fe04c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.188962 4776 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.189002 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wm4b\" (UniqueName: \"kubernetes.io/projected/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-kube-api-access-7wm4b\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.210496 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a52c3d7a-7923-433a-b4ee-3ab8242fe04c" (UID: "a52c3d7a-7923-433a-b4ee-3ab8242fe04c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.222094 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6a6029b7-6021-4e90-b568-d45da6047d62","Type":"ContainerStarted","Data":"85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b"} Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.223076 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a52c3d7a-7923-433a-b4ee-3ab8242fe04c" (UID: "a52c3d7a-7923-433a-b4ee-3ab8242fe04c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.227617 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a52c3d7a-7923-433a-b4ee-3ab8242fe04c" (UID: "a52c3d7a-7923-433a-b4ee-3ab8242fe04c"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.236804 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a52c3d7a-7923-433a-b4ee-3ab8242fe04c" (UID: "a52c3d7a-7923-433a-b4ee-3ab8242fe04c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.248518 4776 scope.go:117] "RemoveContainer" containerID="8d9c3caa3c5c107fd5342c9cf66fef4fa9f31b2fa76058e645052b20dc001970" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.269392 4776 generic.go:334] "Generic (PLEG): container finished" podID="a52c3d7a-7923-433a-b4ee-3ab8242fe04c" containerID="af12b809cba73a4c828c0679864e84da4147143aef6e15ea3257f59df45f08ff" exitCode=0 Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.269459 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" event={"ID":"a52c3d7a-7923-433a-b4ee-3ab8242fe04c","Type":"ContainerDied","Data":"af12b809cba73a4c828c0679864e84da4147143aef6e15ea3257f59df45f08ff"} Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.269487 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" event={"ID":"a52c3d7a-7923-433a-b4ee-3ab8242fe04c","Type":"ContainerDied","Data":"81995df34ed68ed37a586e49f707d6ff5d53685fddf9d3aa79131fe04ab9783b"} Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.269576 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-7tsd6" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.296843 4776 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.296903 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.296944 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.296958 4776 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a52c3d7a-7923-433a-b4ee-3ab8242fe04c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.308466 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"aa8ba719-bab7-4330-97b3-1e1e35d20784","Type":"ContainerStarted","Data":"59ca4d9b3dbb98fd7b6bd4c4027a2474b4e8bcb5056dd2b1fab3e16c6918cd52"} Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.309876 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.336188 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=6.336170205 podStartE2EDuration="6.336170205s" podCreationTimestamp="2025-12-04 10:34:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-04 10:34:43.330016322 +0000 UTC m=+3328.196496699" watchObservedRunningTime="2025-12-04 10:34:43.336170205 +0000 UTC m=+3328.202650582" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.338063 4776 scope.go:117] "RemoveContainer" containerID="af12b809cba73a4c828c0679864e84da4147143aef6e15ea3257f59df45f08ff" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.353126 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-7tsd6"] Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.361469 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-7tsd6"] Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.369642 4776 scope.go:117] "RemoveContainer" containerID="9c42e8590ea98f2930d14b338a97ac0d12edd3ed165adeb51d8e8e4fbcc05230" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.405593 4776 scope.go:117] "RemoveContainer" containerID="af12b809cba73a4c828c0679864e84da4147143aef6e15ea3257f59df45f08ff" Dec 04 10:34:43 crc kubenswrapper[4776]: E1204 10:34:43.406693 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af12b809cba73a4c828c0679864e84da4147143aef6e15ea3257f59df45f08ff\": container with ID starting with af12b809cba73a4c828c0679864e84da4147143aef6e15ea3257f59df45f08ff not found: ID does not exist" containerID="af12b809cba73a4c828c0679864e84da4147143aef6e15ea3257f59df45f08ff" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.406745 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af12b809cba73a4c828c0679864e84da4147143aef6e15ea3257f59df45f08ff"} err="failed to get container status \"af12b809cba73a4c828c0679864e84da4147143aef6e15ea3257f59df45f08ff\": rpc error: code = NotFound desc = could not find container \"af12b809cba73a4c828c0679864e84da4147143aef6e15ea3257f59df45f08ff\": container with ID starting with 
af12b809cba73a4c828c0679864e84da4147143aef6e15ea3257f59df45f08ff not found: ID does not exist" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.406776 4776 scope.go:117] "RemoveContainer" containerID="9c42e8590ea98f2930d14b338a97ac0d12edd3ed165adeb51d8e8e4fbcc05230" Dec 04 10:34:43 crc kubenswrapper[4776]: E1204 10:34:43.407233 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c42e8590ea98f2930d14b338a97ac0d12edd3ed165adeb51d8e8e4fbcc05230\": container with ID starting with 9c42e8590ea98f2930d14b338a97ac0d12edd3ed165adeb51d8e8e4fbcc05230 not found: ID does not exist" containerID="9c42e8590ea98f2930d14b338a97ac0d12edd3ed165adeb51d8e8e4fbcc05230" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.407277 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c42e8590ea98f2930d14b338a97ac0d12edd3ed165adeb51d8e8e4fbcc05230"} err="failed to get container status \"9c42e8590ea98f2930d14b338a97ac0d12edd3ed165adeb51d8e8e4fbcc05230\": rpc error: code = NotFound desc = could not find container \"9c42e8590ea98f2930d14b338a97ac0d12edd3ed165adeb51d8e8e4fbcc05230\": container with ID starting with 9c42e8590ea98f2930d14b338a97ac0d12edd3ed165adeb51d8e8e4fbcc05230 not found: ID does not exist" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.467088 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0576f02c-3bb8-4a18-a9b8-464e2bf22947" path="/var/lib/kubelet/pods/0576f02c-3bb8-4a18-a9b8-464e2bf22947/volumes" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.470849 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" path="/var/lib/kubelet/pods/694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8/volumes" Dec 04 10:34:43 crc kubenswrapper[4776]: I1204 10:34:43.472094 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a52c3d7a-7923-433a-b4ee-3ab8242fe04c" path="/var/lib/kubelet/pods/a52c3d7a-7923-433a-b4ee-3ab8242fe04c/volumes" Dec 04 10:34:44 crc kubenswrapper[4776]: I1204 10:34:44.329309 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6a6029b7-6021-4e90-b568-d45da6047d62","Type":"ContainerStarted","Data":"cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914"} Dec 04 10:34:44 crc kubenswrapper[4776]: I1204 10:34:44.354386 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.387657915 podStartE2EDuration="13.354360034s" podCreationTimestamp="2025-12-04 10:34:31 +0000 UTC" firstStartedPulling="2025-12-04 10:34:32.491088461 +0000 UTC m=+3317.357568838" lastFinishedPulling="2025-12-04 10:34:41.45779058 +0000 UTC m=+3326.324270957" observedRunningTime="2025-12-04 10:34:44.351332849 +0000 UTC m=+3329.217813236" watchObservedRunningTime="2025-12-04 10:34:44.354360034 +0000 UTC m=+3329.220840411" Dec 04 10:34:45 crc kubenswrapper[4776]: I1204 10:34:45.063940 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:34:45 crc kubenswrapper[4776]: I1204 10:34:45.064531 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="ceilometer-central-agent" containerID="cri-o://57e5f2d8dfe1340f1bbf4d82a772f527b4ae3c223287b5f83c3c93b50fbfbe52" gracePeriod=30 Dec 04 10:34:45 crc kubenswrapper[4776]: I1204 10:34:45.064607 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="proxy-httpd" containerID="cri-o://56a1c1505ce2a8267cabdc361250ab25e09e03d00b68ef4c7155a7dcdf3311b7" gracePeriod=30 Dec 04 10:34:45 crc kubenswrapper[4776]: I1204 10:34:45.064631 4776 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="sg-core" containerID="cri-o://a9d0b87cf6d5de710e126a0cfea117faacc95e39f3ee7b40a8af3368ec61d917" gracePeriod=30 Dec 04 10:34:45 crc kubenswrapper[4776]: I1204 10:34:45.064737 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="ceilometer-notification-agent" containerID="cri-o://707951a4e45b874626c60aea2f222d5647d91a26dfa80b3aa7192d8f046661f9" gracePeriod=30 Dec 04 10:34:45 crc kubenswrapper[4776]: I1204 10:34:45.346356 4776 generic.go:334] "Generic (PLEG): container finished" podID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerID="56a1c1505ce2a8267cabdc361250ab25e09e03d00b68ef4c7155a7dcdf3311b7" exitCode=0 Dec 04 10:34:45 crc kubenswrapper[4776]: I1204 10:34:45.346392 4776 generic.go:334] "Generic (PLEG): container finished" podID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerID="a9d0b87cf6d5de710e126a0cfea117faacc95e39f3ee7b40a8af3368ec61d917" exitCode=2 Dec 04 10:34:45 crc kubenswrapper[4776]: I1204 10:34:45.346428 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493","Type":"ContainerDied","Data":"56a1c1505ce2a8267cabdc361250ab25e09e03d00b68ef4c7155a7dcdf3311b7"} Dec 04 10:34:45 crc kubenswrapper[4776]: I1204 10:34:45.346473 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493","Type":"ContainerDied","Data":"a9d0b87cf6d5de710e126a0cfea117faacc95e39f3ee7b40a8af3368ec61d917"} Dec 04 10:34:46 crc kubenswrapper[4776]: I1204 10:34:46.357403 4776 generic.go:334] "Generic (PLEG): container finished" podID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerID="57e5f2d8dfe1340f1bbf4d82a772f527b4ae3c223287b5f83c3c93b50fbfbe52" exitCode=0 Dec 04 10:34:46 crc 
kubenswrapper[4776]: I1204 10:34:46.357466 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493","Type":"ContainerDied","Data":"57e5f2d8dfe1340f1bbf4d82a772f527b4ae3c223287b5f83c3c93b50fbfbe52"} Dec 04 10:34:46 crc kubenswrapper[4776]: I1204 10:34:46.361753 4776 generic.go:334] "Generic (PLEG): container finished" podID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerID="d9c7f5540f9aba3ffd4650dc15da0647bfc66e2316977aa7e31e6dbeb2d7be18" exitCode=0 Dec 04 10:34:46 crc kubenswrapper[4776]: I1204 10:34:46.361798 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fb87d5bd-kzccs" event={"ID":"1b6d19b7-632e-4f82-8311-13e154f240f5","Type":"ContainerDied","Data":"d9c7f5540f9aba3ffd4650dc15da0647bfc66e2316977aa7e31e6dbeb2d7be18"} Dec 04 10:34:47 crc kubenswrapper[4776]: I1204 10:34:47.705632 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-85fb87d5bd-kzccs" podUID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.078025 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.140208 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-ceilometer-tls-certs\") pod \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.140504 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-config-data\") pod \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.140836 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-combined-ca-bundle\") pod \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.140938 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-log-httpd\") pod \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.141124 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-sg-core-conf-yaml\") pod \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.141213 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgqxp\" 
(UniqueName: \"kubernetes.io/projected/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-kube-api-access-tgqxp\") pod \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.141324 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-run-httpd\") pod \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.141399 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-scripts\") pod \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\" (UID: \"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493\") " Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.143705 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" (UID: "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.145889 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" (UID: "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.156048 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-scripts" (OuterVolumeSpecName: "scripts") pod "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" (UID: "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.193753 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-kube-api-access-tgqxp" (OuterVolumeSpecName: "kube-api-access-tgqxp") pod "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" (UID: "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493"). InnerVolumeSpecName "kube-api-access-tgqxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.200382 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" (UID: "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.244455 4776 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.244630 4776 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.244683 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgqxp\" (UniqueName: \"kubernetes.io/projected/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-kube-api-access-tgqxp\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.244731 4776 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.244804 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.265672 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" (UID: "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.279865 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" (UID: "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.308783 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-config-data" (OuterVolumeSpecName: "config-data") pod "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" (UID: "0bf98e46-11f1-4fcf-b1b8-8bf79a36e493"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.347293 4776 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.347669 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.347783 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.411358 4776 generic.go:334] "Generic (PLEG): container finished" podID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerID="707951a4e45b874626c60aea2f222d5647d91a26dfa80b3aa7192d8f046661f9" exitCode=0 
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.411572 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493","Type":"ContainerDied","Data":"707951a4e45b874626c60aea2f222d5647d91a26dfa80b3aa7192d8f046661f9"} Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.411660 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0bf98e46-11f1-4fcf-b1b8-8bf79a36e493","Type":"ContainerDied","Data":"072f3b97f95a0260a2f6ea5d0b63d7f7959fdb0db3bff5992540046614d40e68"} Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.411683 4776 scope.go:117] "RemoveContainer" containerID="56a1c1505ce2a8267cabdc361250ab25e09e03d00b68ef4c7155a7dcdf3311b7" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.411817 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.435950 4776 scope.go:117] "RemoveContainer" containerID="a9d0b87cf6d5de710e126a0cfea117faacc95e39f3ee7b40a8af3368ec61d917" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.467044 4776 scope.go:117] "RemoveContainer" containerID="707951a4e45b874626c60aea2f222d5647d91a26dfa80b3aa7192d8f046661f9" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.468195 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.478460 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.487904 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.488576 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0576f02c-3bb8-4a18-a9b8-464e2bf22947" containerName="horizon" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 
10:34:50.488657 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0576f02c-3bb8-4a18-a9b8-464e2bf22947" containerName="horizon" Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.488749 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" containerName="horizon" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.488799 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" containerName="horizon" Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.488849 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="ceilometer-notification-agent" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.488894 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="ceilometer-notification-agent" Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.488964 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" containerName="horizon-log" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.489012 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" containerName="horizon-log" Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.489073 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52c3d7a-7923-433a-b4ee-3ab8242fe04c" containerName="init" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.489128 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52c3d7a-7923-433a-b4ee-3ab8242fe04c" containerName="init" Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.489215 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="sg-core" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.489272 4776 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="sg-core" Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.489333 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="proxy-httpd" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.489662 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="proxy-httpd" Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.489728 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0576f02c-3bb8-4a18-a9b8-464e2bf22947" containerName="horizon-log" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.489783 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0576f02c-3bb8-4a18-a9b8-464e2bf22947" containerName="horizon-log" Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.489842 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="ceilometer-central-agent" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.489887 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="ceilometer-central-agent" Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.490253 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52c3d7a-7923-433a-b4ee-3ab8242fe04c" containerName="dnsmasq-dns" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.490307 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52c3d7a-7923-433a-b4ee-3ab8242fe04c" containerName="dnsmasq-dns" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.490627 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0576f02c-3bb8-4a18-a9b8-464e2bf22947" containerName="horizon" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.490706 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" containerName="horizon" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.490766 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52c3d7a-7923-433a-b4ee-3ab8242fe04c" containerName="dnsmasq-dns" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.490816 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="ceilometer-central-agent" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.490873 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0576f02c-3bb8-4a18-a9b8-464e2bf22947" containerName="horizon-log" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.490941 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="proxy-httpd" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.490995 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="ceilometer-notification-agent" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.491054 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="694c6084-0a3b-4f76-bcb0-4cf26a9e3fa8" containerName="horizon-log" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.491106 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" containerName="sg-core" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.493702 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.498952 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.502781 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.503196 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.503489 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.517498 4776 scope.go:117] "RemoveContainer" containerID="57e5f2d8dfe1340f1bbf4d82a772f527b4ae3c223287b5f83c3c93b50fbfbe52" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.544970 4776 scope.go:117] "RemoveContainer" containerID="56a1c1505ce2a8267cabdc361250ab25e09e03d00b68ef4c7155a7dcdf3311b7" Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.546016 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a1c1505ce2a8267cabdc361250ab25e09e03d00b68ef4c7155a7dcdf3311b7\": container with ID starting with 56a1c1505ce2a8267cabdc361250ab25e09e03d00b68ef4c7155a7dcdf3311b7 not found: ID does not exist" containerID="56a1c1505ce2a8267cabdc361250ab25e09e03d00b68ef4c7155a7dcdf3311b7" Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.546060 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a1c1505ce2a8267cabdc361250ab25e09e03d00b68ef4c7155a7dcdf3311b7"} err="failed to get container status \"56a1c1505ce2a8267cabdc361250ab25e09e03d00b68ef4c7155a7dcdf3311b7\": rpc error: code = NotFound desc = could not find container \"56a1c1505ce2a8267cabdc361250ab25e09e03d00b68ef4c7155a7dcdf3311b7\": 
container with ID starting with 56a1c1505ce2a8267cabdc361250ab25e09e03d00b68ef4c7155a7dcdf3311b7 not found: ID does not exist"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.546089 4776 scope.go:117] "RemoveContainer" containerID="a9d0b87cf6d5de710e126a0cfea117faacc95e39f3ee7b40a8af3368ec61d917"
Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.546392 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9d0b87cf6d5de710e126a0cfea117faacc95e39f3ee7b40a8af3368ec61d917\": container with ID starting with a9d0b87cf6d5de710e126a0cfea117faacc95e39f3ee7b40a8af3368ec61d917 not found: ID does not exist" containerID="a9d0b87cf6d5de710e126a0cfea117faacc95e39f3ee7b40a8af3368ec61d917"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.546415 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9d0b87cf6d5de710e126a0cfea117faacc95e39f3ee7b40a8af3368ec61d917"} err="failed to get container status \"a9d0b87cf6d5de710e126a0cfea117faacc95e39f3ee7b40a8af3368ec61d917\": rpc error: code = NotFound desc = could not find container \"a9d0b87cf6d5de710e126a0cfea117faacc95e39f3ee7b40a8af3368ec61d917\": container with ID starting with a9d0b87cf6d5de710e126a0cfea117faacc95e39f3ee7b40a8af3368ec61d917 not found: ID does not exist"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.546428 4776 scope.go:117] "RemoveContainer" containerID="707951a4e45b874626c60aea2f222d5647d91a26dfa80b3aa7192d8f046661f9"
Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.546792 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"707951a4e45b874626c60aea2f222d5647d91a26dfa80b3aa7192d8f046661f9\": container with ID starting with 707951a4e45b874626c60aea2f222d5647d91a26dfa80b3aa7192d8f046661f9 not found: ID does not exist" containerID="707951a4e45b874626c60aea2f222d5647d91a26dfa80b3aa7192d8f046661f9"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.546822 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707951a4e45b874626c60aea2f222d5647d91a26dfa80b3aa7192d8f046661f9"} err="failed to get container status \"707951a4e45b874626c60aea2f222d5647d91a26dfa80b3aa7192d8f046661f9\": rpc error: code = NotFound desc = could not find container \"707951a4e45b874626c60aea2f222d5647d91a26dfa80b3aa7192d8f046661f9\": container with ID starting with 707951a4e45b874626c60aea2f222d5647d91a26dfa80b3aa7192d8f046661f9 not found: ID does not exist"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.546838 4776 scope.go:117] "RemoveContainer" containerID="57e5f2d8dfe1340f1bbf4d82a772f527b4ae3c223287b5f83c3c93b50fbfbe52"
Dec 04 10:34:50 crc kubenswrapper[4776]: E1204 10:34:50.547078 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57e5f2d8dfe1340f1bbf4d82a772f527b4ae3c223287b5f83c3c93b50fbfbe52\": container with ID starting with 57e5f2d8dfe1340f1bbf4d82a772f527b4ae3c223287b5f83c3c93b50fbfbe52 not found: ID does not exist" containerID="57e5f2d8dfe1340f1bbf4d82a772f527b4ae3c223287b5f83c3c93b50fbfbe52"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.547098 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e5f2d8dfe1340f1bbf4d82a772f527b4ae3c223287b5f83c3c93b50fbfbe52"} err="failed to get container status \"57e5f2d8dfe1340f1bbf4d82a772f527b4ae3c223287b5f83c3c93b50fbfbe52\": rpc error: code = NotFound desc = could not find container \"57e5f2d8dfe1340f1bbf4d82a772f527b4ae3c223287b5f83c3c93b50fbfbe52\": container with ID starting with 57e5f2d8dfe1340f1bbf4d82a772f527b4ae3c223287b5f83c3c93b50fbfbe52 not found: ID does not exist"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.551437 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.551488 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n64tk\" (UniqueName: \"kubernetes.io/projected/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-kube-api-access-n64tk\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.551525 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-log-httpd\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.551620 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-run-httpd\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.551670 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.551748 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-scripts\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.551836 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-config-data\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.551874 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.653320 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-config-data\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.653383 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.653468 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.653490 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n64tk\" (UniqueName: \"kubernetes.io/projected/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-kube-api-access-n64tk\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.653516 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-log-httpd\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.653574 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-run-httpd\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.653606 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.653656 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-scripts\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.654295 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-log-httpd\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.654362 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-run-httpd\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.659458 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-config-data\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.662613 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.664859 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.665193 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-scripts\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.670626 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.677628 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n64tk\" (UniqueName: \"kubernetes.io/projected/af73cf01-ace5-4bc7-a209-2f9eb86cb7d6-kube-api-access-n64tk\") pod \"ceilometer-0\" (UID: \"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6\") " pod="openstack/ceilometer-0"
Dec 04 10:34:50 crc kubenswrapper[4776]: I1204 10:34:50.817987 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 10:34:51 crc kubenswrapper[4776]: I1204 10:34:51.304317 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 10:34:51 crc kubenswrapper[4776]: I1204 10:34:51.423968 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6","Type":"ContainerStarted","Data":"8f7ca706b7cd12433a4f4de256a6b279bd3df37aa7ee9dd7e78cdceebcdfaf52"}
Dec 04 10:34:51 crc kubenswrapper[4776]: I1204 10:34:51.470965 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf98e46-11f1-4fcf-b1b8-8bf79a36e493" path="/var/lib/kubelet/pods/0bf98e46-11f1-4fcf-b1b8-8bf79a36e493/volumes"
Dec 04 10:34:51 crc kubenswrapper[4776]: I1204 10:34:51.656862 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Dec 04 10:34:52 crc kubenswrapper[4776]: I1204 10:34:52.434607 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6","Type":"ContainerStarted","Data":"67f9df50f184dff73ebfe7c5a02c6c801c35c4fe617087898a40e94f40c125c7"}
Dec 04 10:34:53 crc kubenswrapper[4776]: I1204 10:34:53.382541 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Dec 04 10:34:53 crc kubenswrapper[4776]: I1204 10:34:53.448890 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6","Type":"ContainerStarted","Data":"964a9608f8f15ddb9daf90cd7991f96df7650eba9375f6ca537e0e26e2b82deb"}
Dec 04 10:34:53 crc kubenswrapper[4776]: I1204 10:34:53.464611 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 04 10:34:53 crc kubenswrapper[4776]: I1204 10:34:53.464856 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="e5e262e7-ee7b-40f2-aa77-816faf806d8f" containerName="manila-scheduler" containerID="cri-o://89b932b71a7a358814c204810eaeb1ba27be74502fec1b3793f36341435fcbe1" gracePeriod=30
Dec 04 10:34:53 crc kubenswrapper[4776]: I1204 10:34:53.465009 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="e5e262e7-ee7b-40f2-aa77-816faf806d8f" containerName="probe" containerID="cri-o://7bb0d921044d17920e62e47e0dedc4d0d602ad74d69c6d3a1fa41967a649bf6f" gracePeriod=30
Dec 04 10:34:54 crc kubenswrapper[4776]: I1204 10:34:54.466244 4776 generic.go:334] "Generic (PLEG): container finished" podID="e5e262e7-ee7b-40f2-aa77-816faf806d8f" containerID="7bb0d921044d17920e62e47e0dedc4d0d602ad74d69c6d3a1fa41967a649bf6f" exitCode=0
Dec 04 10:34:54 crc kubenswrapper[4776]: I1204 10:34:54.466611 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e5e262e7-ee7b-40f2-aa77-816faf806d8f","Type":"ContainerDied","Data":"7bb0d921044d17920e62e47e0dedc4d0d602ad74d69c6d3a1fa41967a649bf6f"}
Dec 04 10:34:54 crc kubenswrapper[4776]: I1204 10:34:54.475489 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6","Type":"ContainerStarted","Data":"d35d82a9965212e68789f7ec538f71d1485a32efc32bd8195b4b7c48838d658c"}
Dec 04 10:34:55 crc kubenswrapper[4776]: I1204 10:34:55.498815 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af73cf01-ace5-4bc7-a209-2f9eb86cb7d6","Type":"ContainerStarted","Data":"85de30ae48c764309c68e26e922db5ed3c8efec894a994b43b2cc3f48cf60fa5"}
Dec 04 10:34:55 crc kubenswrapper[4776]: I1204 10:34:55.499513 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 04 10:34:55 crc kubenswrapper[4776]: I1204 10:34:55.532900 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.386172275 podStartE2EDuration="5.5328761s" podCreationTimestamp="2025-12-04 10:34:50 +0000 UTC" firstStartedPulling="2025-12-04 10:34:51.303823797 +0000 UTC m=+3336.170304174" lastFinishedPulling="2025-12-04 10:34:54.450527622 +0000 UTC m=+3339.317007999" observedRunningTime="2025-12-04 10:34:55.528001506 +0000 UTC m=+3340.394481893" watchObservedRunningTime="2025-12-04 10:34:55.5328761 +0000 UTC m=+3340.399356477"
Dec 04 10:34:57 crc kubenswrapper[4776]: I1204 10:34:57.705603 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-85fb87d5bd-kzccs" podUID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused"
Dec 04 10:34:58 crc kubenswrapper[4776]: I1204 10:34:58.527146 4776 generic.go:334] "Generic (PLEG): container finished" podID="e5e262e7-ee7b-40f2-aa77-816faf806d8f" containerID="89b932b71a7a358814c204810eaeb1ba27be74502fec1b3793f36341435fcbe1" exitCode=0
Dec 04 10:34:58 crc kubenswrapper[4776]: I1204 10:34:58.527199 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e5e262e7-ee7b-40f2-aa77-816faf806d8f","Type":"ContainerDied","Data":"89b932b71a7a358814c204810eaeb1ba27be74502fec1b3793f36341435fcbe1"}
Dec 04 10:34:58 crc kubenswrapper[4776]: I1204 10:34:58.826434 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 04 10:34:58 crc kubenswrapper[4776]: I1204 10:34:58.988037 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.011214 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5e262e7-ee7b-40f2-aa77-816faf806d8f-etc-machine-id\") pod \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") "
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.011322 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-config-data\") pod \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") "
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.011343 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5e262e7-ee7b-40f2-aa77-816faf806d8f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e5e262e7-ee7b-40f2-aa77-816faf806d8f" (UID: "e5e262e7-ee7b-40f2-aa77-816faf806d8f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.011376 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-config-data-custom\") pod \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") "
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.011430 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-combined-ca-bundle\") pod \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") "
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.011479 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsr94\" (UniqueName: \"kubernetes.io/projected/e5e262e7-ee7b-40f2-aa77-816faf806d8f-kube-api-access-qsr94\") pod \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") "
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.011507 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-scripts\") pod \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\" (UID: \"e5e262e7-ee7b-40f2-aa77-816faf806d8f\") "
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.011946 4776 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5e262e7-ee7b-40f2-aa77-816faf806d8f-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.035762 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-scripts" (OuterVolumeSpecName: "scripts") pod "e5e262e7-ee7b-40f2-aa77-816faf806d8f" (UID: "e5e262e7-ee7b-40f2-aa77-816faf806d8f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.035816 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e262e7-ee7b-40f2-aa77-816faf806d8f-kube-api-access-qsr94" (OuterVolumeSpecName: "kube-api-access-qsr94") pod "e5e262e7-ee7b-40f2-aa77-816faf806d8f" (UID: "e5e262e7-ee7b-40f2-aa77-816faf806d8f"). InnerVolumeSpecName "kube-api-access-qsr94". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.035858 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e5e262e7-ee7b-40f2-aa77-816faf806d8f" (UID: "e5e262e7-ee7b-40f2-aa77-816faf806d8f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.077185 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5e262e7-ee7b-40f2-aa77-816faf806d8f" (UID: "e5e262e7-ee7b-40f2-aa77-816faf806d8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.114049 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.114088 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.114101 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsr94\" (UniqueName: \"kubernetes.io/projected/e5e262e7-ee7b-40f2-aa77-816faf806d8f-kube-api-access-qsr94\") on node \"crc\" DevicePath \"\""
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.114114 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.172901 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-config-data" (OuterVolumeSpecName: "config-data") pod "e5e262e7-ee7b-40f2-aa77-816faf806d8f" (UID: "e5e262e7-ee7b-40f2-aa77-816faf806d8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.216300 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e262e7-ee7b-40f2-aa77-816faf806d8f-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.538569 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e5e262e7-ee7b-40f2-aa77-816faf806d8f","Type":"ContainerDied","Data":"b5a4ac6705f21bec0f2a4e8bb9f9c94fce8dfa7d4e619721689d849b6986d8a3"}
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.538796 4776 scope.go:117] "RemoveContainer" containerID="7bb0d921044d17920e62e47e0dedc4d0d602ad74d69c6d3a1fa41967a649bf6f"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.538647 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.568356 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.594259 4776 scope.go:117] "RemoveContainer" containerID="89b932b71a7a358814c204810eaeb1ba27be74502fec1b3793f36341435fcbe1"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.632111 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.641084 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Dec 04 10:34:59 crc kubenswrapper[4776]: E1204 10:34:59.642062 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e262e7-ee7b-40f2-aa77-816faf806d8f" containerName="manila-scheduler"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.642101 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e262e7-ee7b-40f2-aa77-816faf806d8f" containerName="manila-scheduler"
Dec 04 10:34:59 crc kubenswrapper[4776]: E1204 10:34:59.642132 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e262e7-ee7b-40f2-aa77-816faf806d8f" containerName="probe"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.642140 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e262e7-ee7b-40f2-aa77-816faf806d8f" containerName="probe"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.642407 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e262e7-ee7b-40f2-aa77-816faf806d8f" containerName="manila-scheduler"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.642438 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e262e7-ee7b-40f2-aa77-816faf806d8f" containerName="probe"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.644197 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.646979 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.650189 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.730401 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.730466 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pwsg\" (UniqueName: \"kubernetes.io/projected/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-kube-api-access-5pwsg\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.730507 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.730530 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.730552 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-scripts\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.730596 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-config-data\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.832581 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-config-data\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.832738 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.832785 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pwsg\" (UniqueName: \"kubernetes.io/projected/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-kube-api-access-5pwsg\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.832850 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.832879 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.832905 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-scripts\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.832970 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.838231 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.838982 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-config-data\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.839132 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-scripts\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.841430 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.867305 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pwsg\" (UniqueName: \"kubernetes.io/projected/2c756ff2-9b1e-42d0-97ca-e173b0de24d5-kube-api-access-5pwsg\") pod \"manila-scheduler-0\" (UID: \"2c756ff2-9b1e-42d0-97ca-e173b0de24d5\") " pod="openstack/manila-scheduler-0"
Dec 04 10:34:59 crc kubenswrapper[4776]: I1204 10:34:59.980501 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 04 10:35:00 crc kubenswrapper[4776]: I1204 10:35:00.479806 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 04 10:35:00 crc kubenswrapper[4776]: I1204 10:35:00.553512 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"2c756ff2-9b1e-42d0-97ca-e173b0de24d5","Type":"ContainerStarted","Data":"5c2595c0562feb645625178fe981cdfccaef3f09ccdfa8049d2a9b52d90cb55f"}
Dec 04 10:35:01 crc kubenswrapper[4776]: I1204 10:35:01.462983 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5e262e7-ee7b-40f2-aa77-816faf806d8f" path="/var/lib/kubelet/pods/e5e262e7-ee7b-40f2-aa77-816faf806d8f/volumes"
Dec 04 10:35:01 crc kubenswrapper[4776]: I1204 10:35:01.567746 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"2c756ff2-9b1e-42d0-97ca-e173b0de24d5","Type":"ContainerStarted","Data":"3383b1c6357c11661b3ab26e5a9b81480e6f12c7b38e10e8d5799017f63ac94f"}
Dec 04 10:35:01 crc kubenswrapper[4776]: I1204 10:35:01.567796 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"2c756ff2-9b1e-42d0-97ca-e173b0de24d5","Type":"ContainerStarted","Data":"3bdc3e7695821054fa1e9f757715c1cdc8e751ff4709cf8b1582f5866443f324"}
Dec 04 10:35:03 crc kubenswrapper[4776]: I1204 10:35:03.431426 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Dec 04 10:35:03 crc kubenswrapper[4776]: I1204 10:35:03.457281 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.45725802 podStartE2EDuration="4.45725802s" podCreationTimestamp="2025-12-04 10:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:01.590305394 +0000 UTC m=+3346.456785791" watchObservedRunningTime="2025-12-04 10:35:03.45725802 +0000 UTC m=+3348.323738397"
Dec 04 10:35:03 crc kubenswrapper[4776]: I1204 10:35:03.505287 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"]
Dec 04 10:35:03 crc kubenswrapper[4776]: I1204 10:35:03.589308 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="6a6029b7-6021-4e90-b568-d45da6047d62" containerName="manila-share" containerID="cri-o://85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b" gracePeriod=30
Dec 04 10:35:03 crc kubenswrapper[4776]: I1204 10:35:03.589644 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="6a6029b7-6021-4e90-b568-d45da6047d62" containerName="probe" containerID="cri-o://cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914" gracePeriod=30
Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.594532 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.603804 4776 generic.go:334] "Generic (PLEG): container finished" podID="6a6029b7-6021-4e90-b568-d45da6047d62" containerID="cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914" exitCode=0
Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.603840 4776 generic.go:334] "Generic (PLEG): container finished" podID="6a6029b7-6021-4e90-b568-d45da6047d62" containerID="85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b" exitCode=1
Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.603864 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6a6029b7-6021-4e90-b568-d45da6047d62","Type":"ContainerDied","Data":"cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914"}
Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.603930 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6a6029b7-6021-4e90-b568-d45da6047d62","Type":"ContainerDied","Data":"85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b"}
Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.603944 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.603964 4776 scope.go:117] "RemoveContainer" containerID="cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.603948 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6a6029b7-6021-4e90-b568-d45da6047d62","Type":"ContainerDied","Data":"179abc4f92d9843d4b8e63346ec969a18ab1886aeb14f58f34fc28f175b8c3dd"} Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.631100 4776 scope.go:117] "RemoveContainer" containerID="85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.703301 4776 scope.go:117] "RemoveContainer" containerID="cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914" Dec 04 10:35:04 crc kubenswrapper[4776]: E1204 10:35:04.703929 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914\": container with ID starting with cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914 not found: ID does not exist" containerID="cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.703965 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914"} err="failed to get container status \"cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914\": rpc error: code = NotFound desc = could not find container \"cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914\": container with ID starting with cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914 not found: ID does not exist" Dec 04 10:35:04 crc 
kubenswrapper[4776]: I1204 10:35:04.703990 4776 scope.go:117] "RemoveContainer" containerID="85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b" Dec 04 10:35:04 crc kubenswrapper[4776]: E1204 10:35:04.704469 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b\": container with ID starting with 85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b not found: ID does not exist" containerID="85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.704501 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b"} err="failed to get container status \"85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b\": rpc error: code = NotFound desc = could not find container \"85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b\": container with ID starting with 85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b not found: ID does not exist" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.704520 4776 scope.go:117] "RemoveContainer" containerID="cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.704811 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914"} err="failed to get container status \"cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914\": rpc error: code = NotFound desc = could not find container \"cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914\": container with ID starting with cb80fbf7c607bd2f260e16c3edad08d01e54b9e11f4b0fca1b1341e49988b914 not found: ID does not exist" Dec 04 
10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.704833 4776 scope.go:117] "RemoveContainer" containerID="85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.705183 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b"} err="failed to get container status \"85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b\": rpc error: code = NotFound desc = could not find container \"85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b\": container with ID starting with 85606466dadec182950b3475e96906eea999cb2c5a041a444ee7dc01405a1f6b not found: ID does not exist" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.729658 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-combined-ca-bundle\") pod \"6a6029b7-6021-4e90-b568-d45da6047d62\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.729811 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-config-data-custom\") pod \"6a6029b7-6021-4e90-b568-d45da6047d62\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.729865 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-config-data\") pod \"6a6029b7-6021-4e90-b568-d45da6047d62\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.729883 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-scripts\") pod \"6a6029b7-6021-4e90-b568-d45da6047d62\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.729943 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cfr2\" (UniqueName: \"kubernetes.io/projected/6a6029b7-6021-4e90-b568-d45da6047d62-kube-api-access-2cfr2\") pod \"6a6029b7-6021-4e90-b568-d45da6047d62\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.729969 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6a6029b7-6021-4e90-b568-d45da6047d62-ceph\") pod \"6a6029b7-6021-4e90-b568-d45da6047d62\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.730022 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a6029b7-6021-4e90-b568-d45da6047d62-etc-machine-id\") pod \"6a6029b7-6021-4e90-b568-d45da6047d62\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.730064 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6a6029b7-6021-4e90-b568-d45da6047d62-var-lib-manila\") pod \"6a6029b7-6021-4e90-b568-d45da6047d62\" (UID: \"6a6029b7-6021-4e90-b568-d45da6047d62\") " Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.730543 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a6029b7-6021-4e90-b568-d45da6047d62-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "6a6029b7-6021-4e90-b568-d45da6047d62" (UID: "6a6029b7-6021-4e90-b568-d45da6047d62"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.735289 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a6029b7-6021-4e90-b568-d45da6047d62-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6a6029b7-6021-4e90-b568-d45da6047d62" (UID: "6a6029b7-6021-4e90-b568-d45da6047d62"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.743021 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-scripts" (OuterVolumeSpecName: "scripts") pod "6a6029b7-6021-4e90-b568-d45da6047d62" (UID: "6a6029b7-6021-4e90-b568-d45da6047d62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.743065 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a6029b7-6021-4e90-b568-d45da6047d62-ceph" (OuterVolumeSpecName: "ceph") pod "6a6029b7-6021-4e90-b568-d45da6047d62" (UID: "6a6029b7-6021-4e90-b568-d45da6047d62"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.743499 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6a6029b7-6021-4e90-b568-d45da6047d62" (UID: "6a6029b7-6021-4e90-b568-d45da6047d62"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.744100 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a6029b7-6021-4e90-b568-d45da6047d62-kube-api-access-2cfr2" (OuterVolumeSpecName: "kube-api-access-2cfr2") pod "6a6029b7-6021-4e90-b568-d45da6047d62" (UID: "6a6029b7-6021-4e90-b568-d45da6047d62"). InnerVolumeSpecName "kube-api-access-2cfr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.810089 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a6029b7-6021-4e90-b568-d45da6047d62" (UID: "6a6029b7-6021-4e90-b568-d45da6047d62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.832569 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cfr2\" (UniqueName: \"kubernetes.io/projected/6a6029b7-6021-4e90-b568-d45da6047d62-kube-api-access-2cfr2\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.832601 4776 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6a6029b7-6021-4e90-b568-d45da6047d62-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.832612 4776 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a6029b7-6021-4e90-b568-d45da6047d62-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.832620 4776 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6a6029b7-6021-4e90-b568-d45da6047d62-var-lib-manila\") on 
node \"crc\" DevicePath \"\"" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.832627 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.832636 4776 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.832643 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.837600 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-config-data" (OuterVolumeSpecName: "config-data") pod "6a6029b7-6021-4e90-b568-d45da6047d62" (UID: "6a6029b7-6021-4e90-b568-d45da6047d62"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.935312 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a6029b7-6021-4e90-b568-d45da6047d62-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.945489 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.957174 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.968469 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:35:04 crc kubenswrapper[4776]: E1204 10:35:04.969560 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6029b7-6021-4e90-b568-d45da6047d62" containerName="probe" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.969624 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6029b7-6021-4e90-b568-d45da6047d62" containerName="probe" Dec 04 10:35:04 crc kubenswrapper[4776]: E1204 10:35:04.969677 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6029b7-6021-4e90-b568-d45da6047d62" containerName="manila-share" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.969725 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6029b7-6021-4e90-b568-d45da6047d62" containerName="manila-share" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.969970 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6029b7-6021-4e90-b568-d45da6047d62" containerName="probe" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.970046 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6029b7-6021-4e90-b568-d45da6047d62" containerName="manila-share" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 
10:35:04.971301 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.978430 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 04 10:35:04 crc kubenswrapper[4776]: I1204 10:35:04.992297 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.142902 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13270572-bb8d-45b4-aa78-156fc1b09a73-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.142996 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13270572-bb8d-45b4-aa78-156fc1b09a73-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.143055 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/13270572-bb8d-45b4-aa78-156fc1b09a73-ceph\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.143147 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13270572-bb8d-45b4-aa78-156fc1b09a73-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " 
pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.143190 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/13270572-bb8d-45b4-aa78-156fc1b09a73-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.143241 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13270572-bb8d-45b4-aa78-156fc1b09a73-scripts\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.143262 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13270572-bb8d-45b4-aa78-156fc1b09a73-config-data\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.143311 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n52sz\" (UniqueName: \"kubernetes.io/projected/13270572-bb8d-45b4-aa78-156fc1b09a73-kube-api-access-n52sz\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.244775 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n52sz\" (UniqueName: \"kubernetes.io/projected/13270572-bb8d-45b4-aa78-156fc1b09a73-kube-api-access-n52sz\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 
10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.244874 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13270572-bb8d-45b4-aa78-156fc1b09a73-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.244935 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13270572-bb8d-45b4-aa78-156fc1b09a73-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.244981 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/13270572-bb8d-45b4-aa78-156fc1b09a73-ceph\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.245025 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13270572-bb8d-45b4-aa78-156fc1b09a73-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.245050 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/13270572-bb8d-45b4-aa78-156fc1b09a73-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.245102 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13270572-bb8d-45b4-aa78-156fc1b09a73-scripts\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.245133 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13270572-bb8d-45b4-aa78-156fc1b09a73-config-data\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.246100 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/13270572-bb8d-45b4-aa78-156fc1b09a73-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.246233 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13270572-bb8d-45b4-aa78-156fc1b09a73-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.249396 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13270572-bb8d-45b4-aa78-156fc1b09a73-scripts\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.249574 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13270572-bb8d-45b4-aa78-156fc1b09a73-config-data-custom\") pod \"manila-share-share1-0\" (UID: 
\"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.249823 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/13270572-bb8d-45b4-aa78-156fc1b09a73-ceph\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.250090 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13270572-bb8d-45b4-aa78-156fc1b09a73-config-data\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.250722 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13270572-bb8d-45b4-aa78-156fc1b09a73-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.265655 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n52sz\" (UniqueName: \"kubernetes.io/projected/13270572-bb8d-45b4-aa78-156fc1b09a73-kube-api-access-n52sz\") pod \"manila-share-share1-0\" (UID: \"13270572-bb8d-45b4-aa78-156fc1b09a73\") " pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.298798 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.469930 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a6029b7-6021-4e90-b568-d45da6047d62" path="/var/lib/kubelet/pods/6a6029b7-6021-4e90-b568-d45da6047d62/volumes" Dec 04 10:35:05 crc kubenswrapper[4776]: I1204 10:35:05.923457 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 10:35:05 crc kubenswrapper[4776]: W1204 10:35:05.925319 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13270572_bb8d_45b4_aa78_156fc1b09a73.slice/crio-02f5bb159b0cdd763f12c725909ffb2fbd62901f16e6e7c72f4a24bb43df3cde WatchSource:0}: Error finding container 02f5bb159b0cdd763f12c725909ffb2fbd62901f16e6e7c72f4a24bb43df3cde: Status 404 returned error can't find the container with id 02f5bb159b0cdd763f12c725909ffb2fbd62901f16e6e7c72f4a24bb43df3cde Dec 04 10:35:06 crc kubenswrapper[4776]: I1204 10:35:06.626412 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"13270572-bb8d-45b4-aa78-156fc1b09a73","Type":"ContainerStarted","Data":"4578e5351848df047023866e165b5f22ce56bc437dfb9301066850c3c6b6a22f"} Dec 04 10:35:06 crc kubenswrapper[4776]: I1204 10:35:06.626739 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"13270572-bb8d-45b4-aa78-156fc1b09a73","Type":"ContainerStarted","Data":"02f5bb159b0cdd763f12c725909ffb2fbd62901f16e6e7c72f4a24bb43df3cde"} Dec 04 10:35:07 crc kubenswrapper[4776]: I1204 10:35:07.636942 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"13270572-bb8d-45b4-aa78-156fc1b09a73","Type":"ContainerStarted","Data":"93923ff18f53d9a103d2fe2e59e41216af1a57154554dbe4e77c1ee3b4e49593"} Dec 04 10:35:07 crc kubenswrapper[4776]: I1204 10:35:07.659796 
4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.659773228 podStartE2EDuration="3.659773228s" podCreationTimestamp="2025-12-04 10:35:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:35:07.656212856 +0000 UTC m=+3352.522693233" watchObservedRunningTime="2025-12-04 10:35:07.659773228 +0000 UTC m=+3352.526253605" Dec 04 10:35:07 crc kubenswrapper[4776]: I1204 10:35:07.705141 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-85fb87d5bd-kzccs" podUID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused" Dec 04 10:35:07 crc kubenswrapper[4776]: I1204 10:35:07.705283 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85fb87d5bd-kzccs" Dec 04 10:35:09 crc kubenswrapper[4776]: I1204 10:35:09.980640 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.681230 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85fb87d5bd-kzccs"
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.696059 4776 generic.go:334] "Generic (PLEG): container finished" podID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerID="09929504e399f57f507f7990d612811343de9580614402c5c5af40102136b1d1" exitCode=137
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.696100 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fb87d5bd-kzccs" event={"ID":"1b6d19b7-632e-4f82-8311-13e154f240f5","Type":"ContainerDied","Data":"09929504e399f57f507f7990d612811343de9580614402c5c5af40102136b1d1"}
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.696126 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fb87d5bd-kzccs" event={"ID":"1b6d19b7-632e-4f82-8311-13e154f240f5","Type":"ContainerDied","Data":"54ae4b003648727847f56a649a69e47d0ac7c6654be0557f0853ffdd0f8d877c"}
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.696145 4776 scope.go:117] "RemoveContainer" containerID="d9c7f5540f9aba3ffd4650dc15da0647bfc66e2316977aa7e31e6dbeb2d7be18"
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.696311 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85fb87d5bd-kzccs"
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.816108 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b6d19b7-632e-4f82-8311-13e154f240f5-logs\") pod \"1b6d19b7-632e-4f82-8311-13e154f240f5\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") "
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.816391 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-combined-ca-bundle\") pod \"1b6d19b7-632e-4f82-8311-13e154f240f5\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") "
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.816465 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b6d19b7-632e-4f82-8311-13e154f240f5-config-data\") pod \"1b6d19b7-632e-4f82-8311-13e154f240f5\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") "
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.816501 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjr5v\" (UniqueName: \"kubernetes.io/projected/1b6d19b7-632e-4f82-8311-13e154f240f5-kube-api-access-zjr5v\") pod \"1b6d19b7-632e-4f82-8311-13e154f240f5\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") "
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.816530 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b6d19b7-632e-4f82-8311-13e154f240f5-scripts\") pod \"1b6d19b7-632e-4f82-8311-13e154f240f5\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") "
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.816609 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-horizon-secret-key\") pod \"1b6d19b7-632e-4f82-8311-13e154f240f5\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") "
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.816642 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-horizon-tls-certs\") pod \"1b6d19b7-632e-4f82-8311-13e154f240f5\" (UID: \"1b6d19b7-632e-4f82-8311-13e154f240f5\") "
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.818443 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b6d19b7-632e-4f82-8311-13e154f240f5-logs" (OuterVolumeSpecName: "logs") pod "1b6d19b7-632e-4f82-8311-13e154f240f5" (UID: "1b6d19b7-632e-4f82-8311-13e154f240f5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.822058 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1b6d19b7-632e-4f82-8311-13e154f240f5" (UID: "1b6d19b7-632e-4f82-8311-13e154f240f5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.822743 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b6d19b7-632e-4f82-8311-13e154f240f5-kube-api-access-zjr5v" (OuterVolumeSpecName: "kube-api-access-zjr5v") pod "1b6d19b7-632e-4f82-8311-13e154f240f5" (UID: "1b6d19b7-632e-4f82-8311-13e154f240f5"). InnerVolumeSpecName "kube-api-access-zjr5v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.845540 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b6d19b7-632e-4f82-8311-13e154f240f5-config-data" (OuterVolumeSpecName: "config-data") pod "1b6d19b7-632e-4f82-8311-13e154f240f5" (UID: "1b6d19b7-632e-4f82-8311-13e154f240f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.851951 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b6d19b7-632e-4f82-8311-13e154f240f5-scripts" (OuterVolumeSpecName: "scripts") pod "1b6d19b7-632e-4f82-8311-13e154f240f5" (UID: "1b6d19b7-632e-4f82-8311-13e154f240f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.862176 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b6d19b7-632e-4f82-8311-13e154f240f5" (UID: "1b6d19b7-632e-4f82-8311-13e154f240f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.868976 4776 scope.go:117] "RemoveContainer" containerID="09929504e399f57f507f7990d612811343de9580614402c5c5af40102136b1d1"
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.876256 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "1b6d19b7-632e-4f82-8311-13e154f240f5" (UID: "1b6d19b7-632e-4f82-8311-13e154f240f5"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.920519 4776 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b6d19b7-632e-4f82-8311-13e154f240f5-logs\") on node \"crc\" DevicePath \"\""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.920551 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.920560 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b6d19b7-632e-4f82-8311-13e154f240f5-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.920573 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjr5v\" (UniqueName: \"kubernetes.io/projected/1b6d19b7-632e-4f82-8311-13e154f240f5-kube-api-access-zjr5v\") on node \"crc\" DevicePath \"\""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.920583 4776 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b6d19b7-632e-4f82-8311-13e154f240f5-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.920593 4776 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.920601 4776 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b6d19b7-632e-4f82-8311-13e154f240f5-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.939798 4776 scope.go:117] "RemoveContainer" containerID="d9c7f5540f9aba3ffd4650dc15da0647bfc66e2316977aa7e31e6dbeb2d7be18"
Dec 04 10:35:12 crc kubenswrapper[4776]: E1204 10:35:12.940394 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c7f5540f9aba3ffd4650dc15da0647bfc66e2316977aa7e31e6dbeb2d7be18\": container with ID starting with d9c7f5540f9aba3ffd4650dc15da0647bfc66e2316977aa7e31e6dbeb2d7be18 not found: ID does not exist" containerID="d9c7f5540f9aba3ffd4650dc15da0647bfc66e2316977aa7e31e6dbeb2d7be18"
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.940440 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c7f5540f9aba3ffd4650dc15da0647bfc66e2316977aa7e31e6dbeb2d7be18"} err="failed to get container status \"d9c7f5540f9aba3ffd4650dc15da0647bfc66e2316977aa7e31e6dbeb2d7be18\": rpc error: code = NotFound desc = could not find container \"d9c7f5540f9aba3ffd4650dc15da0647bfc66e2316977aa7e31e6dbeb2d7be18\": container with ID starting with d9c7f5540f9aba3ffd4650dc15da0647bfc66e2316977aa7e31e6dbeb2d7be18 not found: ID does not exist"
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.940469 4776 scope.go:117] "RemoveContainer" containerID="09929504e399f57f507f7990d612811343de9580614402c5c5af40102136b1d1"
Dec 04 10:35:12 crc kubenswrapper[4776]: E1204 10:35:12.940764 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09929504e399f57f507f7990d612811343de9580614402c5c5af40102136b1d1\": container with ID starting with 09929504e399f57f507f7990d612811343de9580614402c5c5af40102136b1d1 not found: ID does not exist" containerID="09929504e399f57f507f7990d612811343de9580614402c5c5af40102136b1d1"
Dec 04 10:35:12 crc kubenswrapper[4776]: I1204 10:35:12.940816 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09929504e399f57f507f7990d612811343de9580614402c5c5af40102136b1d1"} err="failed to get container status \"09929504e399f57f507f7990d612811343de9580614402c5c5af40102136b1d1\": rpc error: code = NotFound desc = could not find container \"09929504e399f57f507f7990d612811343de9580614402c5c5af40102136b1d1\": container with ID starting with 09929504e399f57f507f7990d612811343de9580614402c5c5af40102136b1d1 not found: ID does not exist"
Dec 04 10:35:13 crc kubenswrapper[4776]: I1204 10:35:13.029231 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85fb87d5bd-kzccs"]
Dec 04 10:35:13 crc kubenswrapper[4776]: I1204 10:35:13.036667 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-85fb87d5bd-kzccs"]
Dec 04 10:35:13 crc kubenswrapper[4776]: I1204 10:35:13.465279 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b6d19b7-632e-4f82-8311-13e154f240f5" path="/var/lib/kubelet/pods/1b6d19b7-632e-4f82-8311-13e154f240f5/volumes"
Dec 04 10:35:15 crc kubenswrapper[4776]: I1204 10:35:15.299412 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Dec 04 10:35:19 crc kubenswrapper[4776]: I1204 10:35:19.379560 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 10:35:19 crc kubenswrapper[4776]: I1204 10:35:19.380147 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 10:35:20 crc kubenswrapper[4776]: I1204 10:35:20.829249 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 04 10:35:21 crc kubenswrapper[4776]: I1204 10:35:21.556030 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Dec 04 10:35:26 crc kubenswrapper[4776]: I1204 10:35:26.788624 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Dec 04 10:35:49 crc kubenswrapper[4776]: I1204 10:35:49.380066 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 10:35:49 crc kubenswrapper[4776]: I1204 10:35:49.380692 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.560616 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g7ds7"]
Dec 04 10:36:06 crc kubenswrapper[4776]: E1204 10:36:06.562520 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerName="horizon-log"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.562555 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerName="horizon-log"
Dec 04 10:36:06 crc kubenswrapper[4776]: E1204 10:36:06.562575 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerName="horizon"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.562584 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerName="horizon"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.562789 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerName="horizon-log"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.562808 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b6d19b7-632e-4f82-8311-13e154f240f5" containerName="horizon"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.564155 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.577561 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g7ds7"]
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.716155 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7556826b-1a16-4285-918b-1df6afb2bfad-catalog-content\") pod \"community-operators-g7ds7\" (UID: \"7556826b-1a16-4285-918b-1df6afb2bfad\") " pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.717089 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k5h9\" (UniqueName: \"kubernetes.io/projected/7556826b-1a16-4285-918b-1df6afb2bfad-kube-api-access-5k5h9\") pod \"community-operators-g7ds7\" (UID: \"7556826b-1a16-4285-918b-1df6afb2bfad\") " pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.717179 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7556826b-1a16-4285-918b-1df6afb2bfad-utilities\") pod \"community-operators-g7ds7\" (UID: \"7556826b-1a16-4285-918b-1df6afb2bfad\") " pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.818661 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7556826b-1a16-4285-918b-1df6afb2bfad-catalog-content\") pod \"community-operators-g7ds7\" (UID: \"7556826b-1a16-4285-918b-1df6afb2bfad\") " pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.818772 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k5h9\" (UniqueName: \"kubernetes.io/projected/7556826b-1a16-4285-918b-1df6afb2bfad-kube-api-access-5k5h9\") pod \"community-operators-g7ds7\" (UID: \"7556826b-1a16-4285-918b-1df6afb2bfad\") " pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.818802 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7556826b-1a16-4285-918b-1df6afb2bfad-utilities\") pod \"community-operators-g7ds7\" (UID: \"7556826b-1a16-4285-918b-1df6afb2bfad\") " pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.819273 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7556826b-1a16-4285-918b-1df6afb2bfad-utilities\") pod \"community-operators-g7ds7\" (UID: \"7556826b-1a16-4285-918b-1df6afb2bfad\") " pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.819275 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7556826b-1a16-4285-918b-1df6afb2bfad-catalog-content\") pod \"community-operators-g7ds7\" (UID: \"7556826b-1a16-4285-918b-1df6afb2bfad\") " pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.838032 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k5h9\" (UniqueName: \"kubernetes.io/projected/7556826b-1a16-4285-918b-1df6afb2bfad-kube-api-access-5k5h9\") pod \"community-operators-g7ds7\" (UID: \"7556826b-1a16-4285-918b-1df6afb2bfad\") " pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:06 crc kubenswrapper[4776]: I1204 10:36:06.893307 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:07 crc kubenswrapper[4776]: I1204 10:36:07.485641 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g7ds7"]
Dec 04 10:36:08 crc kubenswrapper[4776]: I1204 10:36:08.265116 4776 generic.go:334] "Generic (PLEG): container finished" podID="7556826b-1a16-4285-918b-1df6afb2bfad" containerID="497c9316178c9fdc24d67e99233e1853dcde957070fad41f005c26bee979d09d" exitCode=0
Dec 04 10:36:08 crc kubenswrapper[4776]: I1204 10:36:08.265520 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7ds7" event={"ID":"7556826b-1a16-4285-918b-1df6afb2bfad","Type":"ContainerDied","Data":"497c9316178c9fdc24d67e99233e1853dcde957070fad41f005c26bee979d09d"}
Dec 04 10:36:08 crc kubenswrapper[4776]: I1204 10:36:08.265556 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7ds7" event={"ID":"7556826b-1a16-4285-918b-1df6afb2bfad","Type":"ContainerStarted","Data":"4e24ed89751e324173e13e6bc19cce4c3dbd380965c5907557607e45c362a101"}
Dec 04 10:36:09 crc kubenswrapper[4776]: I1204 10:36:09.276066 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7ds7" event={"ID":"7556826b-1a16-4285-918b-1df6afb2bfad","Type":"ContainerStarted","Data":"f6018ca2659a65e13be45a5e1ab600ad799b893255362a7b41759e25c6d3c03d"}
Dec 04 10:36:10 crc kubenswrapper[4776]: I1204 10:36:10.290498 4776 generic.go:334] "Generic (PLEG): container finished" podID="7556826b-1a16-4285-918b-1df6afb2bfad" containerID="f6018ca2659a65e13be45a5e1ab600ad799b893255362a7b41759e25c6d3c03d" exitCode=0
Dec 04 10:36:10 crc kubenswrapper[4776]: I1204 10:36:10.290580 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7ds7" event={"ID":"7556826b-1a16-4285-918b-1df6afb2bfad","Type":"ContainerDied","Data":"f6018ca2659a65e13be45a5e1ab600ad799b893255362a7b41759e25c6d3c03d"}
Dec 04 10:36:11 crc kubenswrapper[4776]: I1204 10:36:11.303765 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7ds7" event={"ID":"7556826b-1a16-4285-918b-1df6afb2bfad","Type":"ContainerStarted","Data":"2e4a8637c1f0314e017b1a382bb779f013b90143b6dd027a396837d30f94e3cc"}
Dec 04 10:36:11 crc kubenswrapper[4776]: I1204 10:36:11.336404 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g7ds7" podStartSLOduration=2.913541474 podStartE2EDuration="5.336375813s" podCreationTimestamp="2025-12-04 10:36:06 +0000 UTC" firstStartedPulling="2025-12-04 10:36:08.26698286 +0000 UTC m=+3413.133463237" lastFinishedPulling="2025-12-04 10:36:10.689817199 +0000 UTC m=+3415.556297576" observedRunningTime="2025-12-04 10:36:11.322712393 +0000 UTC m=+3416.189192790" watchObservedRunningTime="2025-12-04 10:36:11.336375813 +0000 UTC m=+3416.202856200"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.629740 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.631819 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.634361 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.634669 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.634839 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-rjhhj"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.640268 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.643536 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.730708 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e42f4d6-4793-4568-9a55-4d346b39dbac-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.731186 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e42f4d6-4793-4568-9a55-4d346b39dbac-config-data\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.731370 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.833996 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.834051 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e42f4d6-4793-4568-9a55-4d346b39dbac-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.834082 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e42f4d6-4793-4568-9a55-4d346b39dbac-config-data\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.834143 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.834348 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgvbx\" (UniqueName: \"kubernetes.io/projected/5e42f4d6-4793-4568-9a55-4d346b39dbac-kube-api-access-jgvbx\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.834387 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e42f4d6-4793-4568-9a55-4d346b39dbac-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.834419 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.834551 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.834666 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e42f4d6-4793-4568-9a55-4d346b39dbac-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.835385 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e42f4d6-4793-4568-9a55-4d346b39dbac-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.835546 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e42f4d6-4793-4568-9a55-4d346b39dbac-config-data\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.844410 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.894535 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.894586 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.936564 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.936628 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e42f4d6-4793-4568-9a55-4d346b39dbac-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.936648 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgvbx\" (UniqueName: \"kubernetes.io/projected/5e42f4d6-4793-4568-9a55-4d346b39dbac-kube-api-access-jgvbx\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.936670 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.936706 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e42f4d6-4793-4568-9a55-4d346b39dbac-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.936791 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.937105 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.937235 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e42f4d6-4793-4568-9a55-4d346b39dbac-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.938516 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e42f4d6-4793-4568-9a55-4d346b39dbac-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.941000 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.944280 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.950810 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.962186 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgvbx\" (UniqueName: \"kubernetes.io/projected/5e42f4d6-4793-4568-9a55-4d346b39dbac-kube-api-access-jgvbx\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:16 crc kubenswrapper[4776]: I1204 10:36:16.965976 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " pod="openstack/tempest-tests-tempest"
Dec 04 10:36:17 crc kubenswrapper[4776]: I1204 10:36:17.264226 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 04 10:36:17 crc kubenswrapper[4776]: I1204 10:36:17.465478 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g7ds7"
Dec 04 10:36:17 crc kubenswrapper[4776]: I1204 10:36:17.514271 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g7ds7"]
Dec 04 10:36:17 crc kubenswrapper[4776]: I1204 10:36:17.754123 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 04 10:36:18 crc kubenswrapper[4776]: I1204 10:36:18.418133 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5e42f4d6-4793-4568-9a55-4d346b39dbac","Type":"ContainerStarted","Data":"a5627ffccb47c4c0c083e3377a9056f9b22ac438a233a571be3d497578f4162f"}
Dec 04 10:36:19 crc kubenswrapper[4776]: I1204 10:36:19.380589 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 10:36:19 crc kubenswrapper[4776]: I1204 10:36:19.381291 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 10:36:19 crc kubenswrapper[4776]: I1204 10:36:19.381389 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt"
Dec 04 10:36:19 crc kubenswrapper[4776]: I1204 10:36:19.382741 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"28b06bab60fad7f595377c61d0d33cf6466b8ad6aa3300c1a8a4c45dfe1ba590"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 10:36:19 crc kubenswrapper[4776]: I1204 10:36:19.382818 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://28b06bab60fad7f595377c61d0d33cf6466b8ad6aa3300c1a8a4c45dfe1ba590" gracePeriod=600
Dec 04 10:36:19 crc kubenswrapper[4776]: I1204 10:36:19.429345 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g7ds7" podUID="7556826b-1a16-4285-918b-1df6afb2bfad" containerName="registry-server" containerID="cri-o://2e4a8637c1f0314e017b1a382bb779f013b90143b6dd027a396837d30f94e3cc" gracePeriod=2
Dec 04 10:36:20 crc kubenswrapper[4776]: I1204 10:36:20.447765 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="28b06bab60fad7f595377c61d0d33cf6466b8ad6aa3300c1a8a4c45dfe1ba590" exitCode=0
Dec 04 10:36:20 crc kubenswrapper[4776]: I1204 10:36:20.447799 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"28b06bab60fad7f595377c61d0d33cf6466b8ad6aa3300c1a8a4c45dfe1ba590"}
Dec 04 10:36:20 crc kubenswrapper[4776]: I1204 10:36:20.448158 4776 scope.go:117] "RemoveContainer" containerID="634269b397b00a36f1a9939964b68f2a49d5c5180978a2374dd2c8d258b243a0"
Dec 04 10:36:20 crc kubenswrapper[4776]: I1204 10:36:20.452710 4776 generic.go:334] "Generic (PLEG): container finished" podID="7556826b-1a16-4285-918b-1df6afb2bfad" containerID="2e4a8637c1f0314e017b1a382bb779f013b90143b6dd027a396837d30f94e3cc" exitCode=0
Dec 04 10:36:20 crc kubenswrapper[4776]: I1204 10:36:20.452779 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7ds7" event={"ID":"7556826b-1a16-4285-918b-1df6afb2bfad","Type":"ContainerDied","Data":"2e4a8637c1f0314e017b1a382bb779f013b90143b6dd027a396837d30f94e3cc"}
Dec 04 10:36:23 crc kubenswrapper[4776]: I1204 10:36:23.496888 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7ds7" event={"ID":"7556826b-1a16-4285-918b-1df6afb2bfad","Type":"ContainerDied","Data":"4e24ed89751e324173e13e6bc19cce4c3dbd380965c5907557607e45c362a101"}
Dec 04 10:36:23 crc kubenswrapper[4776]: I1204 10:36:23.497432 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e24ed89751e324173e13e6bc19cce4c3dbd380965c5907557607e45c362a101"
Dec 04 10:36:23 crc kubenswrapper[4776]: I1204 10:36:23.552413 4776 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-g7ds7" Dec 04 10:36:23 crc kubenswrapper[4776]: I1204 10:36:23.714110 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7556826b-1a16-4285-918b-1df6afb2bfad-utilities\") pod \"7556826b-1a16-4285-918b-1df6afb2bfad\" (UID: \"7556826b-1a16-4285-918b-1df6afb2bfad\") " Dec 04 10:36:23 crc kubenswrapper[4776]: I1204 10:36:23.714311 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k5h9\" (UniqueName: \"kubernetes.io/projected/7556826b-1a16-4285-918b-1df6afb2bfad-kube-api-access-5k5h9\") pod \"7556826b-1a16-4285-918b-1df6afb2bfad\" (UID: \"7556826b-1a16-4285-918b-1df6afb2bfad\") " Dec 04 10:36:23 crc kubenswrapper[4776]: I1204 10:36:23.714343 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7556826b-1a16-4285-918b-1df6afb2bfad-catalog-content\") pod \"7556826b-1a16-4285-918b-1df6afb2bfad\" (UID: \"7556826b-1a16-4285-918b-1df6afb2bfad\") " Dec 04 10:36:23 crc kubenswrapper[4776]: I1204 10:36:23.716346 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7556826b-1a16-4285-918b-1df6afb2bfad-utilities" (OuterVolumeSpecName: "utilities") pod "7556826b-1a16-4285-918b-1df6afb2bfad" (UID: "7556826b-1a16-4285-918b-1df6afb2bfad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:36:23 crc kubenswrapper[4776]: I1204 10:36:23.722514 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7556826b-1a16-4285-918b-1df6afb2bfad-kube-api-access-5k5h9" (OuterVolumeSpecName: "kube-api-access-5k5h9") pod "7556826b-1a16-4285-918b-1df6afb2bfad" (UID: "7556826b-1a16-4285-918b-1df6afb2bfad"). InnerVolumeSpecName "kube-api-access-5k5h9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:36:23 crc kubenswrapper[4776]: I1204 10:36:23.763780 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7556826b-1a16-4285-918b-1df6afb2bfad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7556826b-1a16-4285-918b-1df6afb2bfad" (UID: "7556826b-1a16-4285-918b-1df6afb2bfad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:36:23 crc kubenswrapper[4776]: I1204 10:36:23.817991 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k5h9\" (UniqueName: \"kubernetes.io/projected/7556826b-1a16-4285-918b-1df6afb2bfad-kube-api-access-5k5h9\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:23 crc kubenswrapper[4776]: I1204 10:36:23.818332 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7556826b-1a16-4285-918b-1df6afb2bfad-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:23 crc kubenswrapper[4776]: I1204 10:36:23.818342 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7556826b-1a16-4285-918b-1df6afb2bfad-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:36:24 crc kubenswrapper[4776]: I1204 10:36:24.510206 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g7ds7" Dec 04 10:36:24 crc kubenswrapper[4776]: I1204 10:36:24.510422 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5"} Dec 04 10:36:24 crc kubenswrapper[4776]: I1204 10:36:24.555814 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g7ds7"] Dec 04 10:36:24 crc kubenswrapper[4776]: I1204 10:36:24.565992 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g7ds7"] Dec 04 10:36:25 crc kubenswrapper[4776]: I1204 10:36:25.464164 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7556826b-1a16-4285-918b-1df6afb2bfad" path="/var/lib/kubelet/pods/7556826b-1a16-4285-918b-1df6afb2bfad/volumes" Dec 04 10:36:45 crc kubenswrapper[4776]: E1204 10:36:45.654764 4776 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 04 10:36:45 crc kubenswrapper[4776]: E1204 10:36:45.655403 4776 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jgvbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(5e42f4d6-4793-4568-9a55-4d346b39dbac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 10:36:45 crc kubenswrapper[4776]: E1204 10:36:45.656821 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="5e42f4d6-4793-4568-9a55-4d346b39dbac" Dec 04 10:36:45 crc kubenswrapper[4776]: E1204 10:36:45.765070 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="5e42f4d6-4793-4568-9a55-4d346b39dbac" Dec 04 10:37:01 crc 
kubenswrapper[4776]: I1204 10:37:01.913022 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5e42f4d6-4793-4568-9a55-4d346b39dbac","Type":"ContainerStarted","Data":"8f847996b183a6b4849fb229ff052b1260206402612e6909f20dacf5262f7778"} Dec 04 10:37:01 crc kubenswrapper[4776]: I1204 10:37:01.942466 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.787779657 podStartE2EDuration="46.942443086s" podCreationTimestamp="2025-12-04 10:36:15 +0000 UTC" firstStartedPulling="2025-12-04 10:36:17.761293321 +0000 UTC m=+3422.627773698" lastFinishedPulling="2025-12-04 10:36:59.91595675 +0000 UTC m=+3464.782437127" observedRunningTime="2025-12-04 10:37:01.929153988 +0000 UTC m=+3466.795634365" watchObservedRunningTime="2025-12-04 10:37:01.942443086 +0000 UTC m=+3466.808923463" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.266506 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8gkt4"] Dec 04 10:37:43 crc kubenswrapper[4776]: E1204 10:37:43.267591 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7556826b-1a16-4285-918b-1df6afb2bfad" containerName="extract-content" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.267606 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7556826b-1a16-4285-918b-1df6afb2bfad" containerName="extract-content" Dec 04 10:37:43 crc kubenswrapper[4776]: E1204 10:37:43.267629 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7556826b-1a16-4285-918b-1df6afb2bfad" containerName="extract-utilities" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.267637 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7556826b-1a16-4285-918b-1df6afb2bfad" containerName="extract-utilities" Dec 04 10:37:43 crc kubenswrapper[4776]: E1204 10:37:43.267655 4776 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7556826b-1a16-4285-918b-1df6afb2bfad" containerName="registry-server" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.267661 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7556826b-1a16-4285-918b-1df6afb2bfad" containerName="registry-server" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.267884 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="7556826b-1a16-4285-918b-1df6afb2bfad" containerName="registry-server" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.270041 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gkt4" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.276517 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gkt4"] Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.412157 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-catalog-content\") pod \"redhat-marketplace-8gkt4\" (UID: \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\") " pod="openshift-marketplace/redhat-marketplace-8gkt4" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.412427 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpv54\" (UniqueName: \"kubernetes.io/projected/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-kube-api-access-jpv54\") pod \"redhat-marketplace-8gkt4\" (UID: \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\") " pod="openshift-marketplace/redhat-marketplace-8gkt4" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.412510 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-utilities\") pod \"redhat-marketplace-8gkt4\" (UID: 
\"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\") " pod="openshift-marketplace/redhat-marketplace-8gkt4" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.514929 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpv54\" (UniqueName: \"kubernetes.io/projected/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-kube-api-access-jpv54\") pod \"redhat-marketplace-8gkt4\" (UID: \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\") " pod="openshift-marketplace/redhat-marketplace-8gkt4" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.515002 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-utilities\") pod \"redhat-marketplace-8gkt4\" (UID: \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\") " pod="openshift-marketplace/redhat-marketplace-8gkt4" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.515171 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-catalog-content\") pod \"redhat-marketplace-8gkt4\" (UID: \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\") " pod="openshift-marketplace/redhat-marketplace-8gkt4" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.515728 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-utilities\") pod \"redhat-marketplace-8gkt4\" (UID: \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\") " pod="openshift-marketplace/redhat-marketplace-8gkt4" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.515757 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-catalog-content\") pod \"redhat-marketplace-8gkt4\" (UID: \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\") " 
pod="openshift-marketplace/redhat-marketplace-8gkt4" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.535320 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpv54\" (UniqueName: \"kubernetes.io/projected/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-kube-api-access-jpv54\") pod \"redhat-marketplace-8gkt4\" (UID: \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\") " pod="openshift-marketplace/redhat-marketplace-8gkt4" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.603014 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gkt4" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.870121 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rtftg"] Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.873029 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rtftg" Dec 04 10:37:43 crc kubenswrapper[4776]: I1204 10:37:43.895675 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rtftg"] Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.034489 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fmvd\" (UniqueName: \"kubernetes.io/projected/7ab357bb-3132-4876-93c9-ee0dd5e50a72-kube-api-access-9fmvd\") pod \"redhat-operators-rtftg\" (UID: \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\") " pod="openshift-marketplace/redhat-operators-rtftg" Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.034572 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ab357bb-3132-4876-93c9-ee0dd5e50a72-utilities\") pod \"redhat-operators-rtftg\" (UID: \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\") " pod="openshift-marketplace/redhat-operators-rtftg" Dec 04 
10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.034617 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ab357bb-3132-4876-93c9-ee0dd5e50a72-catalog-content\") pod \"redhat-operators-rtftg\" (UID: \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\") " pod="openshift-marketplace/redhat-operators-rtftg" Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.133990 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gkt4"] Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.136273 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fmvd\" (UniqueName: \"kubernetes.io/projected/7ab357bb-3132-4876-93c9-ee0dd5e50a72-kube-api-access-9fmvd\") pod \"redhat-operators-rtftg\" (UID: \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\") " pod="openshift-marketplace/redhat-operators-rtftg" Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.136359 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ab357bb-3132-4876-93c9-ee0dd5e50a72-utilities\") pod \"redhat-operators-rtftg\" (UID: \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\") " pod="openshift-marketplace/redhat-operators-rtftg" Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.136412 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ab357bb-3132-4876-93c9-ee0dd5e50a72-catalog-content\") pod \"redhat-operators-rtftg\" (UID: \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\") " pod="openshift-marketplace/redhat-operators-rtftg" Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.137027 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ab357bb-3132-4876-93c9-ee0dd5e50a72-utilities\") pod 
\"redhat-operators-rtftg\" (UID: \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\") " pod="openshift-marketplace/redhat-operators-rtftg" Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.137045 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ab357bb-3132-4876-93c9-ee0dd5e50a72-catalog-content\") pod \"redhat-operators-rtftg\" (UID: \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\") " pod="openshift-marketplace/redhat-operators-rtftg" Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.159626 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fmvd\" (UniqueName: \"kubernetes.io/projected/7ab357bb-3132-4876-93c9-ee0dd5e50a72-kube-api-access-9fmvd\") pod \"redhat-operators-rtftg\" (UID: \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\") " pod="openshift-marketplace/redhat-operators-rtftg" Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.228718 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rtftg" Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.570564 4776 generic.go:334] "Generic (PLEG): container finished" podID="cf5f8d8f-8260-42cc-91ec-2fd1d7877451" containerID="01b07c95c6d16906a239244f37c19f49f2a046853075bf4fee840c12ccef382a" exitCode=0 Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.570679 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkt4" event={"ID":"cf5f8d8f-8260-42cc-91ec-2fd1d7877451","Type":"ContainerDied","Data":"01b07c95c6d16906a239244f37c19f49f2a046853075bf4fee840c12ccef382a"} Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.570972 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkt4" event={"ID":"cf5f8d8f-8260-42cc-91ec-2fd1d7877451","Type":"ContainerStarted","Data":"933cd688908e4a637d5c9269dd9c35b4e3c8ac816ae3f66cbc6a25d5b6b89717"} Dec 04 10:37:44 crc kubenswrapper[4776]: I1204 10:37:44.769045 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rtftg"] Dec 04 10:37:44 crc kubenswrapper[4776]: W1204 10:37:44.774535 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ab357bb_3132_4876_93c9_ee0dd5e50a72.slice/crio-8ac4a3d220582982b492ca029be9b33fc86bb759260705fb1e96d22ce5e18152 WatchSource:0}: Error finding container 8ac4a3d220582982b492ca029be9b33fc86bb759260705fb1e96d22ce5e18152: Status 404 returned error can't find the container with id 8ac4a3d220582982b492ca029be9b33fc86bb759260705fb1e96d22ce5e18152 Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.585220 4776 generic.go:334] "Generic (PLEG): container finished" podID="7ab357bb-3132-4876-93c9-ee0dd5e50a72" containerID="b77f0dc39b2a3f9e2e60d25146da0ad571e6f32ca53d39b53f66305e21d74c0f" exitCode=0 Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 
10:37:45.585303 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtftg" event={"ID":"7ab357bb-3132-4876-93c9-ee0dd5e50a72","Type":"ContainerDied","Data":"b77f0dc39b2a3f9e2e60d25146da0ad571e6f32ca53d39b53f66305e21d74c0f"} Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.585553 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtftg" event={"ID":"7ab357bb-3132-4876-93c9-ee0dd5e50a72","Type":"ContainerStarted","Data":"8ac4a3d220582982b492ca029be9b33fc86bb759260705fb1e96d22ce5e18152"} Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.591222 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkt4" event={"ID":"cf5f8d8f-8260-42cc-91ec-2fd1d7877451","Type":"ContainerStarted","Data":"35794bc52414ef185c72283e454c398fb3fc066023b9dd4c32cf7b2a5bfd9dc0"} Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.672226 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5lqmp"] Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.674841 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5lqmp" Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.702494 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5lqmp"] Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.706549 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34139417-48ad-4ead-a70a-90654e17872b-catalog-content\") pod \"certified-operators-5lqmp\" (UID: \"34139417-48ad-4ead-a70a-90654e17872b\") " pod="openshift-marketplace/certified-operators-5lqmp" Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.706651 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34139417-48ad-4ead-a70a-90654e17872b-utilities\") pod \"certified-operators-5lqmp\" (UID: \"34139417-48ad-4ead-a70a-90654e17872b\") " pod="openshift-marketplace/certified-operators-5lqmp" Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.706993 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk44d\" (UniqueName: \"kubernetes.io/projected/34139417-48ad-4ead-a70a-90654e17872b-kube-api-access-lk44d\") pod \"certified-operators-5lqmp\" (UID: \"34139417-48ad-4ead-a70a-90654e17872b\") " pod="openshift-marketplace/certified-operators-5lqmp" Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.809953 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk44d\" (UniqueName: \"kubernetes.io/projected/34139417-48ad-4ead-a70a-90654e17872b-kube-api-access-lk44d\") pod \"certified-operators-5lqmp\" (UID: \"34139417-48ad-4ead-a70a-90654e17872b\") " pod="openshift-marketplace/certified-operators-5lqmp" Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.810025 4776 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34139417-48ad-4ead-a70a-90654e17872b-catalog-content\") pod \"certified-operators-5lqmp\" (UID: \"34139417-48ad-4ead-a70a-90654e17872b\") " pod="openshift-marketplace/certified-operators-5lqmp"
Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.810109 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34139417-48ad-4ead-a70a-90654e17872b-utilities\") pod \"certified-operators-5lqmp\" (UID: \"34139417-48ad-4ead-a70a-90654e17872b\") " pod="openshift-marketplace/certified-operators-5lqmp"
Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.810657 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34139417-48ad-4ead-a70a-90654e17872b-utilities\") pod \"certified-operators-5lqmp\" (UID: \"34139417-48ad-4ead-a70a-90654e17872b\") " pod="openshift-marketplace/certified-operators-5lqmp"
Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.811316 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34139417-48ad-4ead-a70a-90654e17872b-catalog-content\") pod \"certified-operators-5lqmp\" (UID: \"34139417-48ad-4ead-a70a-90654e17872b\") " pod="openshift-marketplace/certified-operators-5lqmp"
Dec 04 10:37:45 crc kubenswrapper[4776]: I1204 10:37:45.840019 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk44d\" (UniqueName: \"kubernetes.io/projected/34139417-48ad-4ead-a70a-90654e17872b-kube-api-access-lk44d\") pod \"certified-operators-5lqmp\" (UID: \"34139417-48ad-4ead-a70a-90654e17872b\") " pod="openshift-marketplace/certified-operators-5lqmp"
Dec 04 10:37:46 crc kubenswrapper[4776]: I1204 10:37:46.019094 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5lqmp"
Dec 04 10:37:46 crc kubenswrapper[4776]: I1204 10:37:46.390947 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5lqmp"]
Dec 04 10:37:46 crc kubenswrapper[4776]: W1204 10:37:46.415883 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34139417_48ad_4ead_a70a_90654e17872b.slice/crio-d94bccb7ddd4581076a9fe4e71a692b1f3468ddadb27e9d33defe01f8deede50 WatchSource:0}: Error finding container d94bccb7ddd4581076a9fe4e71a692b1f3468ddadb27e9d33defe01f8deede50: Status 404 returned error can't find the container with id d94bccb7ddd4581076a9fe4e71a692b1f3468ddadb27e9d33defe01f8deede50
Dec 04 10:37:46 crc kubenswrapper[4776]: I1204 10:37:46.602824 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtftg" event={"ID":"7ab357bb-3132-4876-93c9-ee0dd5e50a72","Type":"ContainerStarted","Data":"27e687036d450120e9056e571e9713fd0cf8cd7ac36c30d80c5d9f52448df396"}
Dec 04 10:37:46 crc kubenswrapper[4776]: I1204 10:37:46.606053 4776 generic.go:334] "Generic (PLEG): container finished" podID="cf5f8d8f-8260-42cc-91ec-2fd1d7877451" containerID="35794bc52414ef185c72283e454c398fb3fc066023b9dd4c32cf7b2a5bfd9dc0" exitCode=0
Dec 04 10:37:46 crc kubenswrapper[4776]: I1204 10:37:46.606153 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkt4" event={"ID":"cf5f8d8f-8260-42cc-91ec-2fd1d7877451","Type":"ContainerDied","Data":"35794bc52414ef185c72283e454c398fb3fc066023b9dd4c32cf7b2a5bfd9dc0"}
Dec 04 10:37:46 crc kubenswrapper[4776]: I1204 10:37:46.607898 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lqmp" event={"ID":"34139417-48ad-4ead-a70a-90654e17872b","Type":"ContainerStarted","Data":"d94bccb7ddd4581076a9fe4e71a692b1f3468ddadb27e9d33defe01f8deede50"}
Dec 04 10:37:47 crc kubenswrapper[4776]: I1204 10:37:47.619234 4776 generic.go:334] "Generic (PLEG): container finished" podID="7ab357bb-3132-4876-93c9-ee0dd5e50a72" containerID="27e687036d450120e9056e571e9713fd0cf8cd7ac36c30d80c5d9f52448df396" exitCode=0
Dec 04 10:37:47 crc kubenswrapper[4776]: I1204 10:37:47.619287 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtftg" event={"ID":"7ab357bb-3132-4876-93c9-ee0dd5e50a72","Type":"ContainerDied","Data":"27e687036d450120e9056e571e9713fd0cf8cd7ac36c30d80c5d9f52448df396"}
Dec 04 10:37:47 crc kubenswrapper[4776]: I1204 10:37:47.625322 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkt4" event={"ID":"cf5f8d8f-8260-42cc-91ec-2fd1d7877451","Type":"ContainerStarted","Data":"12ef0959b3523280cbaa569cd30dae327c11e4c264029950618e73fb02a8156a"}
Dec 04 10:37:47 crc kubenswrapper[4776]: I1204 10:37:47.627348 4776 generic.go:334] "Generic (PLEG): container finished" podID="34139417-48ad-4ead-a70a-90654e17872b" containerID="84c28a13018d2c51da2933167a9203d4d166018f3b254e13c71cf8529210febd" exitCode=0
Dec 04 10:37:47 crc kubenswrapper[4776]: I1204 10:37:47.627386 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lqmp" event={"ID":"34139417-48ad-4ead-a70a-90654e17872b","Type":"ContainerDied","Data":"84c28a13018d2c51da2933167a9203d4d166018f3b254e13c71cf8529210febd"}
Dec 04 10:37:47 crc kubenswrapper[4776]: I1204 10:37:47.698066 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8gkt4" podStartSLOduration=1.942731105 podStartE2EDuration="4.698035034s" podCreationTimestamp="2025-12-04 10:37:43 +0000 UTC" firstStartedPulling="2025-12-04 10:37:44.572703061 +0000 UTC m=+3509.439183438" lastFinishedPulling="2025-12-04 10:37:47.32800699 +0000 UTC m=+3512.194487367" observedRunningTime="2025-12-04 10:37:47.691086105 +0000 UTC m=+3512.557566492" watchObservedRunningTime="2025-12-04 10:37:47.698035034 +0000 UTC m=+3512.564515411"
Dec 04 10:37:48 crc kubenswrapper[4776]: I1204 10:37:48.638429 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtftg" event={"ID":"7ab357bb-3132-4876-93c9-ee0dd5e50a72","Type":"ContainerStarted","Data":"b0d4f0fd8b79b454a8d2447d8724a38711a80670b76ec25cbe6f5d915a26fe5a"}
Dec 04 10:37:48 crc kubenswrapper[4776]: I1204 10:37:48.640880 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lqmp" event={"ID":"34139417-48ad-4ead-a70a-90654e17872b","Type":"ContainerStarted","Data":"0b548f45a668d425d66aae7c94fb5e26e3dbe03c35385ab52a8f06c61f532f05"}
Dec 04 10:37:48 crc kubenswrapper[4776]: I1204 10:37:48.665944 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rtftg" podStartSLOduration=3.218223718 podStartE2EDuration="5.665894228s" podCreationTimestamp="2025-12-04 10:37:43 +0000 UTC" firstStartedPulling="2025-12-04 10:37:45.587978597 +0000 UTC m=+3510.454458974" lastFinishedPulling="2025-12-04 10:37:48.035649107 +0000 UTC m=+3512.902129484" observedRunningTime="2025-12-04 10:37:48.65861859 +0000 UTC m=+3513.525098967" watchObservedRunningTime="2025-12-04 10:37:48.665894228 +0000 UTC m=+3513.532374605"
Dec 04 10:37:52 crc kubenswrapper[4776]: I1204 10:37:52.678684 4776 generic.go:334] "Generic (PLEG): container finished" podID="34139417-48ad-4ead-a70a-90654e17872b" containerID="0b548f45a668d425d66aae7c94fb5e26e3dbe03c35385ab52a8f06c61f532f05" exitCode=0
Dec 04 10:37:52 crc kubenswrapper[4776]: I1204 10:37:52.678779 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lqmp" event={"ID":"34139417-48ad-4ead-a70a-90654e17872b","Type":"ContainerDied","Data":"0b548f45a668d425d66aae7c94fb5e26e3dbe03c35385ab52a8f06c61f532f05"}
Dec 04 10:37:53 crc kubenswrapper[4776]: I1204 10:37:53.603811 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8gkt4"
Dec 04 10:37:53 crc kubenswrapper[4776]: I1204 10:37:53.604148 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8gkt4"
Dec 04 10:37:53 crc kubenswrapper[4776]: I1204 10:37:53.653522 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8gkt4"
Dec 04 10:37:53 crc kubenswrapper[4776]: I1204 10:37:53.738287 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8gkt4"
Dec 04 10:37:54 crc kubenswrapper[4776]: I1204 10:37:54.058362 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gkt4"]
Dec 04 10:37:54 crc kubenswrapper[4776]: I1204 10:37:54.229798 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rtftg"
Dec 04 10:37:54 crc kubenswrapper[4776]: I1204 10:37:54.229867 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rtftg"
Dec 04 10:37:54 crc kubenswrapper[4776]: I1204 10:37:54.279353 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rtftg"
Dec 04 10:37:54 crc kubenswrapper[4776]: I1204 10:37:54.748355 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rtftg"
Dec 04 10:37:55 crc kubenswrapper[4776]: I1204 10:37:55.707624 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lqmp" event={"ID":"34139417-48ad-4ead-a70a-90654e17872b","Type":"ContainerStarted","Data":"21549d6eccc6307c4550eb6566dc1e6463dc4e9feda4b506cd6a75a73c9f1cb0"}
Dec 04 10:37:55 crc kubenswrapper[4776]: I1204 10:37:55.707807 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8gkt4" podUID="cf5f8d8f-8260-42cc-91ec-2fd1d7877451" containerName="registry-server" containerID="cri-o://12ef0959b3523280cbaa569cd30dae327c11e4c264029950618e73fb02a8156a" gracePeriod=2
Dec 04 10:37:55 crc kubenswrapper[4776]: I1204 10:37:55.738126 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5lqmp" podStartSLOduration=3.6315086069999998 podStartE2EDuration="10.738096504s" podCreationTimestamp="2025-12-04 10:37:45 +0000 UTC" firstStartedPulling="2025-12-04 10:37:47.6305501 +0000 UTC m=+3512.497030467" lastFinishedPulling="2025-12-04 10:37:54.737137977 +0000 UTC m=+3519.603618364" observedRunningTime="2025-12-04 10:37:55.729971259 +0000 UTC m=+3520.596451646" watchObservedRunningTime="2025-12-04 10:37:55.738096504 +0000 UTC m=+3520.604576881"
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.020140 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5lqmp"
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.020546 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5lqmp"
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.254197 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gkt4"
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.272615 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-utilities\") pod \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\" (UID: \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\") "
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.272958 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-catalog-content\") pod \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\" (UID: \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\") "
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.273102 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpv54\" (UniqueName: \"kubernetes.io/projected/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-kube-api-access-jpv54\") pod \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\" (UID: \"cf5f8d8f-8260-42cc-91ec-2fd1d7877451\") "
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.273305 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-utilities" (OuterVolumeSpecName: "utilities") pod "cf5f8d8f-8260-42cc-91ec-2fd1d7877451" (UID: "cf5f8d8f-8260-42cc-91ec-2fd1d7877451"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.274587 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.282223 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-kube-api-access-jpv54" (OuterVolumeSpecName: "kube-api-access-jpv54") pod "cf5f8d8f-8260-42cc-91ec-2fd1d7877451" (UID: "cf5f8d8f-8260-42cc-91ec-2fd1d7877451"). InnerVolumeSpecName "kube-api-access-jpv54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.298131 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf5f8d8f-8260-42cc-91ec-2fd1d7877451" (UID: "cf5f8d8f-8260-42cc-91ec-2fd1d7877451"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.376720 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpv54\" (UniqueName: \"kubernetes.io/projected/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-kube-api-access-jpv54\") on node \"crc\" DevicePath \"\""
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.376753 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf5f8d8f-8260-42cc-91ec-2fd1d7877451-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.658189 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rtftg"]
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.719108 4776 generic.go:334] "Generic (PLEG): container finished" podID="cf5f8d8f-8260-42cc-91ec-2fd1d7877451" containerID="12ef0959b3523280cbaa569cd30dae327c11e4c264029950618e73fb02a8156a" exitCode=0
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.719175 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkt4" event={"ID":"cf5f8d8f-8260-42cc-91ec-2fd1d7877451","Type":"ContainerDied","Data":"12ef0959b3523280cbaa569cd30dae327c11e4c264029950618e73fb02a8156a"}
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.719221 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gkt4" event={"ID":"cf5f8d8f-8260-42cc-91ec-2fd1d7877451","Type":"ContainerDied","Data":"933cd688908e4a637d5c9269dd9c35b4e3c8ac816ae3f66cbc6a25d5b6b89717"}
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.719222 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gkt4"
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.719254 4776 scope.go:117] "RemoveContainer" containerID="12ef0959b3523280cbaa569cd30dae327c11e4c264029950618e73fb02a8156a"
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.719368 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rtftg" podUID="7ab357bb-3132-4876-93c9-ee0dd5e50a72" containerName="registry-server" containerID="cri-o://b0d4f0fd8b79b454a8d2447d8724a38711a80670b76ec25cbe6f5d915a26fe5a" gracePeriod=2
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.739879 4776 scope.go:117] "RemoveContainer" containerID="35794bc52414ef185c72283e454c398fb3fc066023b9dd4c32cf7b2a5bfd9dc0"
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.761564 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gkt4"]
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.770471 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gkt4"]
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.773667 4776 scope.go:117] "RemoveContainer" containerID="01b07c95c6d16906a239244f37c19f49f2a046853075bf4fee840c12ccef382a"
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.934146 4776 scope.go:117] "RemoveContainer" containerID="12ef0959b3523280cbaa569cd30dae327c11e4c264029950618e73fb02a8156a"
Dec 04 10:37:56 crc kubenswrapper[4776]: E1204 10:37:56.934540 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12ef0959b3523280cbaa569cd30dae327c11e4c264029950618e73fb02a8156a\": container with ID starting with 12ef0959b3523280cbaa569cd30dae327c11e4c264029950618e73fb02a8156a not found: ID does not exist" containerID="12ef0959b3523280cbaa569cd30dae327c11e4c264029950618e73fb02a8156a"
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.934579 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ef0959b3523280cbaa569cd30dae327c11e4c264029950618e73fb02a8156a"} err="failed to get container status \"12ef0959b3523280cbaa569cd30dae327c11e4c264029950618e73fb02a8156a\": rpc error: code = NotFound desc = could not find container \"12ef0959b3523280cbaa569cd30dae327c11e4c264029950618e73fb02a8156a\": container with ID starting with 12ef0959b3523280cbaa569cd30dae327c11e4c264029950618e73fb02a8156a not found: ID does not exist"
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.934609 4776 scope.go:117] "RemoveContainer" containerID="35794bc52414ef185c72283e454c398fb3fc066023b9dd4c32cf7b2a5bfd9dc0"
Dec 04 10:37:56 crc kubenswrapper[4776]: E1204 10:37:56.934957 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35794bc52414ef185c72283e454c398fb3fc066023b9dd4c32cf7b2a5bfd9dc0\": container with ID starting with 35794bc52414ef185c72283e454c398fb3fc066023b9dd4c32cf7b2a5bfd9dc0 not found: ID does not exist" containerID="35794bc52414ef185c72283e454c398fb3fc066023b9dd4c32cf7b2a5bfd9dc0"
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.935015 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35794bc52414ef185c72283e454c398fb3fc066023b9dd4c32cf7b2a5bfd9dc0"} err="failed to get container status \"35794bc52414ef185c72283e454c398fb3fc066023b9dd4c32cf7b2a5bfd9dc0\": rpc error: code = NotFound desc = could not find container \"35794bc52414ef185c72283e454c398fb3fc066023b9dd4c32cf7b2a5bfd9dc0\": container with ID starting with 35794bc52414ef185c72283e454c398fb3fc066023b9dd4c32cf7b2a5bfd9dc0 not found: ID does not exist"
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.935053 4776 scope.go:117] "RemoveContainer" containerID="01b07c95c6d16906a239244f37c19f49f2a046853075bf4fee840c12ccef382a"
Dec 04 10:37:56 crc kubenswrapper[4776]: E1204 10:37:56.935422 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01b07c95c6d16906a239244f37c19f49f2a046853075bf4fee840c12ccef382a\": container with ID starting with 01b07c95c6d16906a239244f37c19f49f2a046853075bf4fee840c12ccef382a not found: ID does not exist" containerID="01b07c95c6d16906a239244f37c19f49f2a046853075bf4fee840c12ccef382a"
Dec 04 10:37:56 crc kubenswrapper[4776]: I1204 10:37:56.935446 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01b07c95c6d16906a239244f37c19f49f2a046853075bf4fee840c12ccef382a"} err="failed to get container status \"01b07c95c6d16906a239244f37c19f49f2a046853075bf4fee840c12ccef382a\": rpc error: code = NotFound desc = could not find container \"01b07c95c6d16906a239244f37c19f49f2a046853075bf4fee840c12ccef382a\": container with ID starting with 01b07c95c6d16906a239244f37c19f49f2a046853075bf4fee840c12ccef382a not found: ID does not exist"
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.079112 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-5lqmp" podUID="34139417-48ad-4ead-a70a-90654e17872b" containerName="registry-server" probeResult="failure" output=<
Dec 04 10:37:57 crc kubenswrapper[4776]: 	timeout: failed to connect service ":50051" within 1s
Dec 04 10:37:57 crc kubenswrapper[4776]: >
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.184294 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rtftg"
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.295935 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ab357bb-3132-4876-93c9-ee0dd5e50a72-catalog-content\") pod \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\" (UID: \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\") "
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.296071 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ab357bb-3132-4876-93c9-ee0dd5e50a72-utilities\") pod \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\" (UID: \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\") "
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.296447 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fmvd\" (UniqueName: \"kubernetes.io/projected/7ab357bb-3132-4876-93c9-ee0dd5e50a72-kube-api-access-9fmvd\") pod \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\" (UID: \"7ab357bb-3132-4876-93c9-ee0dd5e50a72\") "
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.297309 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ab357bb-3132-4876-93c9-ee0dd5e50a72-utilities" (OuterVolumeSpecName: "utilities") pod "7ab357bb-3132-4876-93c9-ee0dd5e50a72" (UID: "7ab357bb-3132-4876-93c9-ee0dd5e50a72"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.302545 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab357bb-3132-4876-93c9-ee0dd5e50a72-kube-api-access-9fmvd" (OuterVolumeSpecName: "kube-api-access-9fmvd") pod "7ab357bb-3132-4876-93c9-ee0dd5e50a72" (UID: "7ab357bb-3132-4876-93c9-ee0dd5e50a72"). InnerVolumeSpecName "kube-api-access-9fmvd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.398788 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ab357bb-3132-4876-93c9-ee0dd5e50a72-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.398824 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fmvd\" (UniqueName: \"kubernetes.io/projected/7ab357bb-3132-4876-93c9-ee0dd5e50a72-kube-api-access-9fmvd\") on node \"crc\" DevicePath \"\""
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.402960 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ab357bb-3132-4876-93c9-ee0dd5e50a72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ab357bb-3132-4876-93c9-ee0dd5e50a72" (UID: "7ab357bb-3132-4876-93c9-ee0dd5e50a72"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.465082 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf5f8d8f-8260-42cc-91ec-2fd1d7877451" path="/var/lib/kubelet/pods/cf5f8d8f-8260-42cc-91ec-2fd1d7877451/volumes"
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.500941 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ab357bb-3132-4876-93c9-ee0dd5e50a72-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.731834 4776 generic.go:334] "Generic (PLEG): container finished" podID="7ab357bb-3132-4876-93c9-ee0dd5e50a72" containerID="b0d4f0fd8b79b454a8d2447d8724a38711a80670b76ec25cbe6f5d915a26fe5a" exitCode=0
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.731955 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rtftg"
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.731990 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtftg" event={"ID":"7ab357bb-3132-4876-93c9-ee0dd5e50a72","Type":"ContainerDied","Data":"b0d4f0fd8b79b454a8d2447d8724a38711a80670b76ec25cbe6f5d915a26fe5a"}
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.732037 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtftg" event={"ID":"7ab357bb-3132-4876-93c9-ee0dd5e50a72","Type":"ContainerDied","Data":"8ac4a3d220582982b492ca029be9b33fc86bb759260705fb1e96d22ce5e18152"}
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.732059 4776 scope.go:117] "RemoveContainer" containerID="b0d4f0fd8b79b454a8d2447d8724a38711a80670b76ec25cbe6f5d915a26fe5a"
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.756310 4776 scope.go:117] "RemoveContainer" containerID="27e687036d450120e9056e571e9713fd0cf8cd7ac36c30d80c5d9f52448df396"
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.756794 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rtftg"]
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.767897 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rtftg"]
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.780151 4776 scope.go:117] "RemoveContainer" containerID="b77f0dc39b2a3f9e2e60d25146da0ad571e6f32ca53d39b53f66305e21d74c0f"
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.797454 4776 scope.go:117] "RemoveContainer" containerID="b0d4f0fd8b79b454a8d2447d8724a38711a80670b76ec25cbe6f5d915a26fe5a"
Dec 04 10:37:57 crc kubenswrapper[4776]: E1204 10:37:57.798859 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0d4f0fd8b79b454a8d2447d8724a38711a80670b76ec25cbe6f5d915a26fe5a\": container with ID starting with b0d4f0fd8b79b454a8d2447d8724a38711a80670b76ec25cbe6f5d915a26fe5a not found: ID does not exist" containerID="b0d4f0fd8b79b454a8d2447d8724a38711a80670b76ec25cbe6f5d915a26fe5a"
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.798935 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0d4f0fd8b79b454a8d2447d8724a38711a80670b76ec25cbe6f5d915a26fe5a"} err="failed to get container status \"b0d4f0fd8b79b454a8d2447d8724a38711a80670b76ec25cbe6f5d915a26fe5a\": rpc error: code = NotFound desc = could not find container \"b0d4f0fd8b79b454a8d2447d8724a38711a80670b76ec25cbe6f5d915a26fe5a\": container with ID starting with b0d4f0fd8b79b454a8d2447d8724a38711a80670b76ec25cbe6f5d915a26fe5a not found: ID does not exist"
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.798972 4776 scope.go:117] "RemoveContainer" containerID="27e687036d450120e9056e571e9713fd0cf8cd7ac36c30d80c5d9f52448df396"
Dec 04 10:37:57 crc kubenswrapper[4776]: E1204 10:37:57.799332 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e687036d450120e9056e571e9713fd0cf8cd7ac36c30d80c5d9f52448df396\": container with ID starting with 27e687036d450120e9056e571e9713fd0cf8cd7ac36c30d80c5d9f52448df396 not found: ID does not exist" containerID="27e687036d450120e9056e571e9713fd0cf8cd7ac36c30d80c5d9f52448df396"
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.799370 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e687036d450120e9056e571e9713fd0cf8cd7ac36c30d80c5d9f52448df396"} err="failed to get container status \"27e687036d450120e9056e571e9713fd0cf8cd7ac36c30d80c5d9f52448df396\": rpc error: code = NotFound desc = could not find container \"27e687036d450120e9056e571e9713fd0cf8cd7ac36c30d80c5d9f52448df396\": container with ID starting with 27e687036d450120e9056e571e9713fd0cf8cd7ac36c30d80c5d9f52448df396 not found: ID does not exist"
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.799396 4776 scope.go:117] "RemoveContainer" containerID="b77f0dc39b2a3f9e2e60d25146da0ad571e6f32ca53d39b53f66305e21d74c0f"
Dec 04 10:37:57 crc kubenswrapper[4776]: E1204 10:37:57.799665 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b77f0dc39b2a3f9e2e60d25146da0ad571e6f32ca53d39b53f66305e21d74c0f\": container with ID starting with b77f0dc39b2a3f9e2e60d25146da0ad571e6f32ca53d39b53f66305e21d74c0f not found: ID does not exist" containerID="b77f0dc39b2a3f9e2e60d25146da0ad571e6f32ca53d39b53f66305e21d74c0f"
Dec 04 10:37:57 crc kubenswrapper[4776]: I1204 10:37:57.799697 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77f0dc39b2a3f9e2e60d25146da0ad571e6f32ca53d39b53f66305e21d74c0f"} err="failed to get container status \"b77f0dc39b2a3f9e2e60d25146da0ad571e6f32ca53d39b53f66305e21d74c0f\": rpc error: code = NotFound desc = could not find container \"b77f0dc39b2a3f9e2e60d25146da0ad571e6f32ca53d39b53f66305e21d74c0f\": container with ID starting with b77f0dc39b2a3f9e2e60d25146da0ad571e6f32ca53d39b53f66305e21d74c0f not found: ID does not exist"
Dec 04 10:37:59 crc kubenswrapper[4776]: I1204 10:37:59.466391 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ab357bb-3132-4876-93c9-ee0dd5e50a72" path="/var/lib/kubelet/pods/7ab357bb-3132-4876-93c9-ee0dd5e50a72/volumes"
Dec 04 10:38:06 crc kubenswrapper[4776]: I1204 10:38:06.070744 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5lqmp"
Dec 04 10:38:06 crc kubenswrapper[4776]: I1204 10:38:06.122400 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5lqmp"
Dec 04 10:38:06 crc kubenswrapper[4776]: I1204 10:38:06.307669 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5lqmp"]
Dec 04 10:38:07 crc kubenswrapper[4776]: I1204 10:38:07.837738 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5lqmp" podUID="34139417-48ad-4ead-a70a-90654e17872b" containerName="registry-server" containerID="cri-o://21549d6eccc6307c4550eb6566dc1e6463dc4e9feda4b506cd6a75a73c9f1cb0" gracePeriod=2
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.314155 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5lqmp"
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.414672 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34139417-48ad-4ead-a70a-90654e17872b-utilities\") pod \"34139417-48ad-4ead-a70a-90654e17872b\" (UID: \"34139417-48ad-4ead-a70a-90654e17872b\") "
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.414790 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34139417-48ad-4ead-a70a-90654e17872b-catalog-content\") pod \"34139417-48ad-4ead-a70a-90654e17872b\" (UID: \"34139417-48ad-4ead-a70a-90654e17872b\") "
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.414910 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk44d\" (UniqueName: \"kubernetes.io/projected/34139417-48ad-4ead-a70a-90654e17872b-kube-api-access-lk44d\") pod \"34139417-48ad-4ead-a70a-90654e17872b\" (UID: \"34139417-48ad-4ead-a70a-90654e17872b\") "
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.415496 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34139417-48ad-4ead-a70a-90654e17872b-utilities" (OuterVolumeSpecName: "utilities") pod "34139417-48ad-4ead-a70a-90654e17872b" (UID: "34139417-48ad-4ead-a70a-90654e17872b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.427266 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34139417-48ad-4ead-a70a-90654e17872b-kube-api-access-lk44d" (OuterVolumeSpecName: "kube-api-access-lk44d") pod "34139417-48ad-4ead-a70a-90654e17872b" (UID: "34139417-48ad-4ead-a70a-90654e17872b"). InnerVolumeSpecName "kube-api-access-lk44d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.478737 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34139417-48ad-4ead-a70a-90654e17872b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34139417-48ad-4ead-a70a-90654e17872b" (UID: "34139417-48ad-4ead-a70a-90654e17872b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.516906 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34139417-48ad-4ead-a70a-90654e17872b-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.516961 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34139417-48ad-4ead-a70a-90654e17872b-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.516973 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk44d\" (UniqueName: \"kubernetes.io/projected/34139417-48ad-4ead-a70a-90654e17872b-kube-api-access-lk44d\") on node \"crc\" DevicePath \"\""
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.850349 4776 generic.go:334] "Generic (PLEG): container finished" podID="34139417-48ad-4ead-a70a-90654e17872b" containerID="21549d6eccc6307c4550eb6566dc1e6463dc4e9feda4b506cd6a75a73c9f1cb0" exitCode=0
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.850453 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5lqmp"
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.851420 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lqmp" event={"ID":"34139417-48ad-4ead-a70a-90654e17872b","Type":"ContainerDied","Data":"21549d6eccc6307c4550eb6566dc1e6463dc4e9feda4b506cd6a75a73c9f1cb0"}
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.851533 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lqmp" event={"ID":"34139417-48ad-4ead-a70a-90654e17872b","Type":"ContainerDied","Data":"d94bccb7ddd4581076a9fe4e71a692b1f3468ddadb27e9d33defe01f8deede50"}
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.851605 4776 scope.go:117] "RemoveContainer" containerID="21549d6eccc6307c4550eb6566dc1e6463dc4e9feda4b506cd6a75a73c9f1cb0"
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.879135 4776 scope.go:117] "RemoveContainer" containerID="0b548f45a668d425d66aae7c94fb5e26e3dbe03c35385ab52a8f06c61f532f05"
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.891669 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5lqmp"]
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.900578 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5lqmp"]
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.924260 4776 scope.go:117] "RemoveContainer" containerID="84c28a13018d2c51da2933167a9203d4d166018f3b254e13c71cf8529210febd"
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.961728 4776 scope.go:117] "RemoveContainer" containerID="21549d6eccc6307c4550eb6566dc1e6463dc4e9feda4b506cd6a75a73c9f1cb0"
Dec 04 10:38:08 crc kubenswrapper[4776]: E1204 10:38:08.962361 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21549d6eccc6307c4550eb6566dc1e6463dc4e9feda4b506cd6a75a73c9f1cb0\": container with ID starting with 21549d6eccc6307c4550eb6566dc1e6463dc4e9feda4b506cd6a75a73c9f1cb0 not found: ID does not exist" containerID="21549d6eccc6307c4550eb6566dc1e6463dc4e9feda4b506cd6a75a73c9f1cb0"
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.962412 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21549d6eccc6307c4550eb6566dc1e6463dc4e9feda4b506cd6a75a73c9f1cb0"} err="failed to get container status \"21549d6eccc6307c4550eb6566dc1e6463dc4e9feda4b506cd6a75a73c9f1cb0\": rpc error: code = NotFound desc = could not find container \"21549d6eccc6307c4550eb6566dc1e6463dc4e9feda4b506cd6a75a73c9f1cb0\": container with ID starting with 21549d6eccc6307c4550eb6566dc1e6463dc4e9feda4b506cd6a75a73c9f1cb0 not found: ID does not exist"
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.962569 4776 scope.go:117] "RemoveContainer" containerID="0b548f45a668d425d66aae7c94fb5e26e3dbe03c35385ab52a8f06c61f532f05"
Dec 04 10:38:08 crc kubenswrapper[4776]: E1204 10:38:08.962970 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b548f45a668d425d66aae7c94fb5e26e3dbe03c35385ab52a8f06c61f532f05\": container with ID starting with 0b548f45a668d425d66aae7c94fb5e26e3dbe03c35385ab52a8f06c61f532f05 not found: ID does not exist" containerID="0b548f45a668d425d66aae7c94fb5e26e3dbe03c35385ab52a8f06c61f532f05"
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.962996 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b548f45a668d425d66aae7c94fb5e26e3dbe03c35385ab52a8f06c61f532f05"} err="failed to get container status \"0b548f45a668d425d66aae7c94fb5e26e3dbe03c35385ab52a8f06c61f532f05\": rpc error: code = NotFound desc = could not find container \"0b548f45a668d425d66aae7c94fb5e26e3dbe03c35385ab52a8f06c61f532f05\": container with ID starting with 0b548f45a668d425d66aae7c94fb5e26e3dbe03c35385ab52a8f06c61f532f05 not found: ID does not exist"
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.963030 4776 scope.go:117] "RemoveContainer" containerID="84c28a13018d2c51da2933167a9203d4d166018f3b254e13c71cf8529210febd"
Dec 04 10:38:08 crc kubenswrapper[4776]: E1204 10:38:08.963233 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84c28a13018d2c51da2933167a9203d4d166018f3b254e13c71cf8529210febd\": container with ID starting with 84c28a13018d2c51da2933167a9203d4d166018f3b254e13c71cf8529210febd not found: ID does not exist" containerID="84c28a13018d2c51da2933167a9203d4d166018f3b254e13c71cf8529210febd"
Dec 04 10:38:08 crc kubenswrapper[4776]: I1204 10:38:08.963278 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c28a13018d2c51da2933167a9203d4d166018f3b254e13c71cf8529210febd"} err="failed to get container status \"84c28a13018d2c51da2933167a9203d4d166018f3b254e13c71cf8529210febd\": rpc error: code = NotFound desc = could not find container \"84c28a13018d2c51da2933167a9203d4d166018f3b254e13c71cf8529210febd\": container with ID starting with 84c28a13018d2c51da2933167a9203d4d166018f3b254e13c71cf8529210febd not found: ID does not exist"
Dec 04 10:38:09 crc kubenswrapper[4776]: I1204 10:38:09.464881 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34139417-48ad-4ead-a70a-90654e17872b" path="/var/lib/kubelet/pods/34139417-48ad-4ead-a70a-90654e17872b/volumes"
Dec 04 10:38:49 crc kubenswrapper[4776]: I1204 10:38:49.380788 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 10:38:49 crc kubenswrapper[4776]: I1204
10:38:49.381820 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:39:19 crc kubenswrapper[4776]: I1204 10:39:19.379580 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:39:19 crc kubenswrapper[4776]: I1204 10:39:19.380164 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:39:49 crc kubenswrapper[4776]: I1204 10:39:49.379909 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:39:49 crc kubenswrapper[4776]: I1204 10:39:49.380445 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:39:49 crc kubenswrapper[4776]: I1204 10:39:49.380496 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 10:39:49 crc kubenswrapper[4776]: I1204 10:39:49.381245 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:39:49 crc kubenswrapper[4776]: I1204 10:39:49.381293 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" gracePeriod=600 Dec 04 10:39:49 crc kubenswrapper[4776]: E1204 10:39:49.513589 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:39:49 crc kubenswrapper[4776]: I1204 10:39:49.833781 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" exitCode=0 Dec 04 10:39:49 crc kubenswrapper[4776]: I1204 10:39:49.833862 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5"} Dec 04 10:39:49 crc 
kubenswrapper[4776]: I1204 10:39:49.834669 4776 scope.go:117] "RemoveContainer" containerID="28b06bab60fad7f595377c61d0d33cf6466b8ad6aa3300c1a8a4c45dfe1ba590" Dec 04 10:39:49 crc kubenswrapper[4776]: I1204 10:39:49.835419 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:39:49 crc kubenswrapper[4776]: E1204 10:39:49.835864 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:40:02 crc kubenswrapper[4776]: I1204 10:40:02.454592 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:40:02 crc kubenswrapper[4776]: E1204 10:40:02.455911 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:40:14 crc kubenswrapper[4776]: I1204 10:40:14.453211 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:40:14 crc kubenswrapper[4776]: E1204 10:40:14.454103 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:40:29 crc kubenswrapper[4776]: I1204 10:40:29.454303 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:40:29 crc kubenswrapper[4776]: E1204 10:40:29.455167 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:40:40 crc kubenswrapper[4776]: I1204 10:40:40.453000 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:40:40 crc kubenswrapper[4776]: E1204 10:40:40.453809 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:40:51 crc kubenswrapper[4776]: I1204 10:40:51.452646 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:40:51 crc kubenswrapper[4776]: E1204 10:40:51.453547 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:41:03 crc kubenswrapper[4776]: I1204 10:41:03.452210 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:41:03 crc kubenswrapper[4776]: E1204 10:41:03.452937 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:41:15 crc kubenswrapper[4776]: I1204 10:41:15.459833 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:41:15 crc kubenswrapper[4776]: E1204 10:41:15.460656 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:41:28 crc kubenswrapper[4776]: I1204 10:41:28.452342 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:41:28 crc kubenswrapper[4776]: E1204 10:41:28.453185 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:41:43 crc kubenswrapper[4776]: I1204 10:41:43.452123 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:41:43 crc kubenswrapper[4776]: E1204 10:41:43.452834 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:41:45 crc kubenswrapper[4776]: I1204 10:41:45.315356 4776 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-97ppf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 10:41:45 crc kubenswrapper[4776]: I1204 10:41:45.315863 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" podUID="d11245f8-3b53-4363-babf-6d47d9628e1b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 10:41:45 crc kubenswrapper[4776]: I1204 10:41:45.328354 4776 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-97ppf container/olm-operator 
namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 10:41:45 crc kubenswrapper[4776]: I1204 10:41:45.328416 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-97ppf" podUID="d11245f8-3b53-4363-babf-6d47d9628e1b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 10:41:45 crc kubenswrapper[4776]: I1204 10:41:45.335173 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="be0c172a-45d2-4fab-940c-f343c9e227fc" containerName="galera" probeResult="failure" output="command timed out" Dec 04 10:41:45 crc kubenswrapper[4776]: I1204 10:41:45.335311 4776 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="be0c172a-45d2-4fab-940c-f343c9e227fc" containerName="galera" probeResult="failure" output="command timed out" Dec 04 10:41:56 crc kubenswrapper[4776]: I1204 10:41:56.453274 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:41:56 crc kubenswrapper[4776]: E1204 10:41:56.454480 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:42:08 crc kubenswrapper[4776]: I1204 10:42:08.453537 4776 scope.go:117] "RemoveContainer" 
containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:42:08 crc kubenswrapper[4776]: E1204 10:42:08.454682 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:42:22 crc kubenswrapper[4776]: I1204 10:42:22.454117 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:42:22 crc kubenswrapper[4776]: E1204 10:42:22.456533 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:42:23 crc kubenswrapper[4776]: I1204 10:42:23.429295 4776 scope.go:117] "RemoveContainer" containerID="2e4a8637c1f0314e017b1a382bb779f013b90143b6dd027a396837d30f94e3cc" Dec 04 10:42:23 crc kubenswrapper[4776]: I1204 10:42:23.451652 4776 scope.go:117] "RemoveContainer" containerID="f6018ca2659a65e13be45a5e1ab600ad799b893255362a7b41759e25c6d3c03d" Dec 04 10:42:23 crc kubenswrapper[4776]: I1204 10:42:23.472380 4776 scope.go:117] "RemoveContainer" containerID="497c9316178c9fdc24d67e99233e1853dcde957070fad41f005c26bee979d09d" Dec 04 10:42:34 crc kubenswrapper[4776]: I1204 10:42:34.452780 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:42:34 crc 
kubenswrapper[4776]: E1204 10:42:34.453488 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:42:46 crc kubenswrapper[4776]: I1204 10:42:46.452752 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:42:46 crc kubenswrapper[4776]: E1204 10:42:46.453488 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:42:59 crc kubenswrapper[4776]: I1204 10:42:59.451991 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:42:59 crc kubenswrapper[4776]: E1204 10:42:59.452726 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:43:12 crc kubenswrapper[4776]: I1204 10:43:12.452855 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 
04 10:43:12 crc kubenswrapper[4776]: E1204 10:43:12.453832 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:43:27 crc kubenswrapper[4776]: I1204 10:43:27.455894 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:43:27 crc kubenswrapper[4776]: E1204 10:43:27.456885 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:43:39 crc kubenswrapper[4776]: I1204 10:43:39.453223 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:43:39 crc kubenswrapper[4776]: E1204 10:43:39.464129 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:43:53 crc kubenswrapper[4776]: I1204 10:43:53.452662 4776 scope.go:117] "RemoveContainer" 
containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:43:53 crc kubenswrapper[4776]: E1204 10:43:53.453453 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:43:58 crc kubenswrapper[4776]: I1204 10:43:58.039981 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-xz78f"] Dec 04 10:43:58 crc kubenswrapper[4776]: I1204 10:43:58.048200 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-xz78f"] Dec 04 10:43:59 crc kubenswrapper[4776]: I1204 10:43:59.038726 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-a778-account-create-update-4tzth"] Dec 04 10:43:59 crc kubenswrapper[4776]: I1204 10:43:59.050403 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-a778-account-create-update-4tzth"] Dec 04 10:43:59 crc kubenswrapper[4776]: I1204 10:43:59.463690 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de77ab0-bcef-4f39-b1f5-10ea8feddbed" path="/var/lib/kubelet/pods/3de77ab0-bcef-4f39-b1f5-10ea8feddbed/volumes" Dec 04 10:43:59 crc kubenswrapper[4776]: I1204 10:43:59.464418 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62604a5a-c38e-4972-92ee-a103a6214b3d" path="/var/lib/kubelet/pods/62604a5a-c38e-4972-92ee-a103a6214b3d/volumes" Dec 04 10:44:04 crc kubenswrapper[4776]: I1204 10:44:04.453063 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:44:04 crc kubenswrapper[4776]: E1204 10:44:04.455800 4776 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:44:18 crc kubenswrapper[4776]: I1204 10:44:18.452536 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:44:18 crc kubenswrapper[4776]: E1204 10:44:18.453334 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:44:23 crc kubenswrapper[4776]: I1204 10:44:23.574463 4776 scope.go:117] "RemoveContainer" containerID="9851c74bf0b4ea7fd66e142ac5d05c7130d8784d6732a058ee1926e5787ad49d" Dec 04 10:44:23 crc kubenswrapper[4776]: I1204 10:44:23.598904 4776 scope.go:117] "RemoveContainer" containerID="82612873a29b4d392dc8016f0766b610029f1dc5bee7cac7631e657f1b1f4bc3" Dec 04 10:44:30 crc kubenswrapper[4776]: I1204 10:44:30.064124 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-svl4r"] Dec 04 10:44:30 crc kubenswrapper[4776]: I1204 10:44:30.080019 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-svl4r"] Dec 04 10:44:31 crc kubenswrapper[4776]: I1204 10:44:31.464748 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7479329d-b468-4068-b9a7-3bf148b4a299" 
path="/var/lib/kubelet/pods/7479329d-b468-4068-b9a7-3bf148b4a299/volumes" Dec 04 10:44:33 crc kubenswrapper[4776]: I1204 10:44:33.452430 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:44:33 crc kubenswrapper[4776]: E1204 10:44:33.453111 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:44:48 crc kubenswrapper[4776]: I1204 10:44:48.452507 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:44:48 crc kubenswrapper[4776]: E1204 10:44:48.453735 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:44:59 crc kubenswrapper[4776]: I1204 10:44:59.452817 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.092726 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"de6ae4981b991fcd0800f65005c52451c9f035f8277040b04437c770c82be074"} Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.196124 
4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx"] Dec 04 10:45:00 crc kubenswrapper[4776]: E1204 10:45:00.197001 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5f8d8f-8260-42cc-91ec-2fd1d7877451" containerName="registry-server" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.197016 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5f8d8f-8260-42cc-91ec-2fd1d7877451" containerName="registry-server" Dec 04 10:45:00 crc kubenswrapper[4776]: E1204 10:45:00.197032 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab357bb-3132-4876-93c9-ee0dd5e50a72" containerName="registry-server" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.197039 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab357bb-3132-4876-93c9-ee0dd5e50a72" containerName="registry-server" Dec 04 10:45:00 crc kubenswrapper[4776]: E1204 10:45:00.197053 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5f8d8f-8260-42cc-91ec-2fd1d7877451" containerName="extract-content" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.197059 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5f8d8f-8260-42cc-91ec-2fd1d7877451" containerName="extract-content" Dec 04 10:45:00 crc kubenswrapper[4776]: E1204 10:45:00.197069 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab357bb-3132-4876-93c9-ee0dd5e50a72" containerName="extract-content" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.197075 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab357bb-3132-4876-93c9-ee0dd5e50a72" containerName="extract-content" Dec 04 10:45:00 crc kubenswrapper[4776]: E1204 10:45:00.197083 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34139417-48ad-4ead-a70a-90654e17872b" containerName="extract-utilities" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.197089 4776 
state_mem.go:107] "Deleted CPUSet assignment" podUID="34139417-48ad-4ead-a70a-90654e17872b" containerName="extract-utilities" Dec 04 10:45:00 crc kubenswrapper[4776]: E1204 10:45:00.197101 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab357bb-3132-4876-93c9-ee0dd5e50a72" containerName="extract-utilities" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.197107 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab357bb-3132-4876-93c9-ee0dd5e50a72" containerName="extract-utilities" Dec 04 10:45:00 crc kubenswrapper[4776]: E1204 10:45:00.197118 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34139417-48ad-4ead-a70a-90654e17872b" containerName="extract-content" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.197123 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="34139417-48ad-4ead-a70a-90654e17872b" containerName="extract-content" Dec 04 10:45:00 crc kubenswrapper[4776]: E1204 10:45:00.197141 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34139417-48ad-4ead-a70a-90654e17872b" containerName="registry-server" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.197147 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="34139417-48ad-4ead-a70a-90654e17872b" containerName="registry-server" Dec 04 10:45:00 crc kubenswrapper[4776]: E1204 10:45:00.197172 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5f8d8f-8260-42cc-91ec-2fd1d7877451" containerName="extract-utilities" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.197178 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5f8d8f-8260-42cc-91ec-2fd1d7877451" containerName="extract-utilities" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.197433 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5f8d8f-8260-42cc-91ec-2fd1d7877451" containerName="registry-server" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.197463 4776 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab357bb-3132-4876-93c9-ee0dd5e50a72" containerName="registry-server" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.197478 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="34139417-48ad-4ead-a70a-90654e17872b" containerName="registry-server" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.198431 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.207206 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.207517 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.216802 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx"] Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.376248 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjlwc\" (UniqueName: \"kubernetes.io/projected/8031151f-715f-4639-b6f4-67a0d3be709f-kube-api-access-wjlwc\") pod \"collect-profiles-29414085-k2cqx\" (UID: \"8031151f-715f-4639-b6f4-67a0d3be709f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.376593 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8031151f-715f-4639-b6f4-67a0d3be709f-config-volume\") pod \"collect-profiles-29414085-k2cqx\" (UID: \"8031151f-715f-4639-b6f4-67a0d3be709f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.376696 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8031151f-715f-4639-b6f4-67a0d3be709f-secret-volume\") pod \"collect-profiles-29414085-k2cqx\" (UID: \"8031151f-715f-4639-b6f4-67a0d3be709f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.478105 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8031151f-715f-4639-b6f4-67a0d3be709f-config-volume\") pod \"collect-profiles-29414085-k2cqx\" (UID: \"8031151f-715f-4639-b6f4-67a0d3be709f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.478170 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8031151f-715f-4639-b6f4-67a0d3be709f-secret-volume\") pod \"collect-profiles-29414085-k2cqx\" (UID: \"8031151f-715f-4639-b6f4-67a0d3be709f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.479036 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8031151f-715f-4639-b6f4-67a0d3be709f-config-volume\") pod \"collect-profiles-29414085-k2cqx\" (UID: \"8031151f-715f-4639-b6f4-67a0d3be709f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.479229 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjlwc\" (UniqueName: 
\"kubernetes.io/projected/8031151f-715f-4639-b6f4-67a0d3be709f-kube-api-access-wjlwc\") pod \"collect-profiles-29414085-k2cqx\" (UID: \"8031151f-715f-4639-b6f4-67a0d3be709f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.497887 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8031151f-715f-4639-b6f4-67a0d3be709f-secret-volume\") pod \"collect-profiles-29414085-k2cqx\" (UID: \"8031151f-715f-4639-b6f4-67a0d3be709f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.509555 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjlwc\" (UniqueName: \"kubernetes.io/projected/8031151f-715f-4639-b6f4-67a0d3be709f-kube-api-access-wjlwc\") pod \"collect-profiles-29414085-k2cqx\" (UID: \"8031151f-715f-4639-b6f4-67a0d3be709f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" Dec 04 10:45:00 crc kubenswrapper[4776]: I1204 10:45:00.524662 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" Dec 04 10:45:01 crc kubenswrapper[4776]: W1204 10:45:01.053714 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8031151f_715f_4639_b6f4_67a0d3be709f.slice/crio-0b88fb1564863f6146df1899f4561b244061816a6ce59471a95501e2e7cc0f9d WatchSource:0}: Error finding container 0b88fb1564863f6146df1899f4561b244061816a6ce59471a95501e2e7cc0f9d: Status 404 returned error can't find the container with id 0b88fb1564863f6146df1899f4561b244061816a6ce59471a95501e2e7cc0f9d Dec 04 10:45:01 crc kubenswrapper[4776]: I1204 10:45:01.056033 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx"] Dec 04 10:45:01 crc kubenswrapper[4776]: I1204 10:45:01.104476 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" event={"ID":"8031151f-715f-4639-b6f4-67a0d3be709f","Type":"ContainerStarted","Data":"0b88fb1564863f6146df1899f4561b244061816a6ce59471a95501e2e7cc0f9d"} Dec 04 10:45:02 crc kubenswrapper[4776]: I1204 10:45:02.116640 4776 generic.go:334] "Generic (PLEG): container finished" podID="8031151f-715f-4639-b6f4-67a0d3be709f" containerID="db0c7081f4e3eca023be40ea1bd98328c9141dbdfa21ca7dfc3178b9f25b51df" exitCode=0 Dec 04 10:45:02 crc kubenswrapper[4776]: I1204 10:45:02.117053 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" event={"ID":"8031151f-715f-4639-b6f4-67a0d3be709f","Type":"ContainerDied","Data":"db0c7081f4e3eca023be40ea1bd98328c9141dbdfa21ca7dfc3178b9f25b51df"} Dec 04 10:45:03 crc kubenswrapper[4776]: I1204 10:45:03.531494 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" Dec 04 10:45:03 crc kubenswrapper[4776]: I1204 10:45:03.661038 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjlwc\" (UniqueName: \"kubernetes.io/projected/8031151f-715f-4639-b6f4-67a0d3be709f-kube-api-access-wjlwc\") pod \"8031151f-715f-4639-b6f4-67a0d3be709f\" (UID: \"8031151f-715f-4639-b6f4-67a0d3be709f\") " Dec 04 10:45:03 crc kubenswrapper[4776]: I1204 10:45:03.661528 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8031151f-715f-4639-b6f4-67a0d3be709f-secret-volume\") pod \"8031151f-715f-4639-b6f4-67a0d3be709f\" (UID: \"8031151f-715f-4639-b6f4-67a0d3be709f\") " Dec 04 10:45:03 crc kubenswrapper[4776]: I1204 10:45:03.661633 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8031151f-715f-4639-b6f4-67a0d3be709f-config-volume\") pod \"8031151f-715f-4639-b6f4-67a0d3be709f\" (UID: \"8031151f-715f-4639-b6f4-67a0d3be709f\") " Dec 04 10:45:03 crc kubenswrapper[4776]: I1204 10:45:03.662518 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8031151f-715f-4639-b6f4-67a0d3be709f-config-volume" (OuterVolumeSpecName: "config-volume") pod "8031151f-715f-4639-b6f4-67a0d3be709f" (UID: "8031151f-715f-4639-b6f4-67a0d3be709f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:45:03 crc kubenswrapper[4776]: I1204 10:45:03.667101 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8031151f-715f-4639-b6f4-67a0d3be709f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8031151f-715f-4639-b6f4-67a0d3be709f" (UID: "8031151f-715f-4639-b6f4-67a0d3be709f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:45:03 crc kubenswrapper[4776]: I1204 10:45:03.676885 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8031151f-715f-4639-b6f4-67a0d3be709f-kube-api-access-wjlwc" (OuterVolumeSpecName: "kube-api-access-wjlwc") pod "8031151f-715f-4639-b6f4-67a0d3be709f" (UID: "8031151f-715f-4639-b6f4-67a0d3be709f"). InnerVolumeSpecName "kube-api-access-wjlwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:45:03 crc kubenswrapper[4776]: I1204 10:45:03.764397 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8031151f-715f-4639-b6f4-67a0d3be709f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:45:03 crc kubenswrapper[4776]: I1204 10:45:03.764596 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjlwc\" (UniqueName: \"kubernetes.io/projected/8031151f-715f-4639-b6f4-67a0d3be709f-kube-api-access-wjlwc\") on node \"crc\" DevicePath \"\"" Dec 04 10:45:03 crc kubenswrapper[4776]: I1204 10:45:03.764658 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8031151f-715f-4639-b6f4-67a0d3be709f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 10:45:04 crc kubenswrapper[4776]: I1204 10:45:04.134377 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" event={"ID":"8031151f-715f-4639-b6f4-67a0d3be709f","Type":"ContainerDied","Data":"0b88fb1564863f6146df1899f4561b244061816a6ce59471a95501e2e7cc0f9d"} Dec 04 10:45:04 crc kubenswrapper[4776]: I1204 10:45:04.134427 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b88fb1564863f6146df1899f4561b244061816a6ce59471a95501e2e7cc0f9d" Dec 04 10:45:04 crc kubenswrapper[4776]: I1204 10:45:04.134490 4776 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414085-k2cqx" Dec 04 10:45:04 crc kubenswrapper[4776]: I1204 10:45:04.613609 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8"] Dec 04 10:45:04 crc kubenswrapper[4776]: I1204 10:45:04.622900 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414040-7k8l8"] Dec 04 10:45:05 crc kubenswrapper[4776]: I1204 10:45:05.465019 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb6d2bf-b3df-46c7-8f0a-466805a68315" path="/var/lib/kubelet/pods/4bb6d2bf-b3df-46c7-8f0a-466805a68315/volumes" Dec 04 10:45:23 crc kubenswrapper[4776]: I1204 10:45:23.710236 4776 scope.go:117] "RemoveContainer" containerID="50084a6e5b0f1ff86902087de9ade4e5514b22c0c90933c58ceeef15b535194a" Dec 04 10:45:23 crc kubenswrapper[4776]: I1204 10:45:23.744308 4776 scope.go:117] "RemoveContainer" containerID="f579a5e78de540f59a32c794abc8fe627d2f20223870d2dfa6a19d403ad13c26" Dec 04 10:46:47 crc kubenswrapper[4776]: I1204 10:46:47.951723 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6bgpj"] Dec 04 10:46:47 crc kubenswrapper[4776]: E1204 10:46:47.958768 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8031151f-715f-4639-b6f4-67a0d3be709f" containerName="collect-profiles" Dec 04 10:46:47 crc kubenswrapper[4776]: I1204 10:46:47.958803 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8031151f-715f-4639-b6f4-67a0d3be709f" containerName="collect-profiles" Dec 04 10:46:47 crc kubenswrapper[4776]: I1204 10:46:47.959131 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8031151f-715f-4639-b6f4-67a0d3be709f" containerName="collect-profiles" Dec 04 10:46:47 crc kubenswrapper[4776]: I1204 10:46:47.963975 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-6bgpj"] Dec 04 10:46:47 crc kubenswrapper[4776]: I1204 10:46:47.964107 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:47 crc kubenswrapper[4776]: I1204 10:46:47.994824 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc8b0ced-5068-46cd-8d45-d8bf3362de30-catalog-content\") pod \"community-operators-6bgpj\" (UID: \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\") " pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:47 crc kubenswrapper[4776]: I1204 10:46:47.994879 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nntf6\" (UniqueName: \"kubernetes.io/projected/bc8b0ced-5068-46cd-8d45-d8bf3362de30-kube-api-access-nntf6\") pod \"community-operators-6bgpj\" (UID: \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\") " pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:47 crc kubenswrapper[4776]: I1204 10:46:47.995081 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc8b0ced-5068-46cd-8d45-d8bf3362de30-utilities\") pod \"community-operators-6bgpj\" (UID: \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\") " pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:48 crc kubenswrapper[4776]: I1204 10:46:48.097188 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc8b0ced-5068-46cd-8d45-d8bf3362de30-utilities\") pod \"community-operators-6bgpj\" (UID: \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\") " pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:48 crc kubenswrapper[4776]: I1204 10:46:48.097757 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc8b0ced-5068-46cd-8d45-d8bf3362de30-catalog-content\") pod \"community-operators-6bgpj\" (UID: \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\") " pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:48 crc kubenswrapper[4776]: I1204 10:46:48.097798 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nntf6\" (UniqueName: \"kubernetes.io/projected/bc8b0ced-5068-46cd-8d45-d8bf3362de30-kube-api-access-nntf6\") pod \"community-operators-6bgpj\" (UID: \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\") " pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:48 crc kubenswrapper[4776]: I1204 10:46:48.097936 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc8b0ced-5068-46cd-8d45-d8bf3362de30-utilities\") pod \"community-operators-6bgpj\" (UID: \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\") " pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:48 crc kubenswrapper[4776]: I1204 10:46:48.098412 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc8b0ced-5068-46cd-8d45-d8bf3362de30-catalog-content\") pod \"community-operators-6bgpj\" (UID: \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\") " pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:48 crc kubenswrapper[4776]: I1204 10:46:48.127585 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nntf6\" (UniqueName: \"kubernetes.io/projected/bc8b0ced-5068-46cd-8d45-d8bf3362de30-kube-api-access-nntf6\") pod \"community-operators-6bgpj\" (UID: \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\") " pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:48 crc kubenswrapper[4776]: I1204 10:46:48.287476 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:48 crc kubenswrapper[4776]: I1204 10:46:48.831656 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6bgpj"] Dec 04 10:46:49 crc kubenswrapper[4776]: I1204 10:46:49.763180 4776 generic.go:334] "Generic (PLEG): container finished" podID="bc8b0ced-5068-46cd-8d45-d8bf3362de30" containerID="015282c8607bda35b6de7f35ce74d2ba61f654772f736c17301e2370e6d3092f" exitCode=0 Dec 04 10:46:49 crc kubenswrapper[4776]: I1204 10:46:49.763239 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bgpj" event={"ID":"bc8b0ced-5068-46cd-8d45-d8bf3362de30","Type":"ContainerDied","Data":"015282c8607bda35b6de7f35ce74d2ba61f654772f736c17301e2370e6d3092f"} Dec 04 10:46:49 crc kubenswrapper[4776]: I1204 10:46:49.763737 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bgpj" event={"ID":"bc8b0ced-5068-46cd-8d45-d8bf3362de30","Type":"ContainerStarted","Data":"566faf0d93e4cb3a09afdd3a8f27f6fc3b162b611abc17cdfee1a909cff43b83"} Dec 04 10:46:49 crc kubenswrapper[4776]: I1204 10:46:49.767418 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:46:50 crc kubenswrapper[4776]: I1204 10:46:50.775130 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bgpj" event={"ID":"bc8b0ced-5068-46cd-8d45-d8bf3362de30","Type":"ContainerStarted","Data":"fdd045c52d698e3117fda2f9c81b420d859b992aad72c91d5b6ec7185dbdc5ae"} Dec 04 10:46:51 crc kubenswrapper[4776]: I1204 10:46:51.790004 4776 generic.go:334] "Generic (PLEG): container finished" podID="bc8b0ced-5068-46cd-8d45-d8bf3362de30" containerID="fdd045c52d698e3117fda2f9c81b420d859b992aad72c91d5b6ec7185dbdc5ae" exitCode=0 Dec 04 10:46:51 crc kubenswrapper[4776]: I1204 10:46:51.790171 4776 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-6bgpj" event={"ID":"bc8b0ced-5068-46cd-8d45-d8bf3362de30","Type":"ContainerDied","Data":"fdd045c52d698e3117fda2f9c81b420d859b992aad72c91d5b6ec7185dbdc5ae"} Dec 04 10:46:52 crc kubenswrapper[4776]: I1204 10:46:52.800670 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bgpj" event={"ID":"bc8b0ced-5068-46cd-8d45-d8bf3362de30","Type":"ContainerStarted","Data":"a20b878ce8b9dea823d33399bc18ecc30a7a7c86dc6797ac30e8ba6b3d767341"} Dec 04 10:46:52 crc kubenswrapper[4776]: I1204 10:46:52.828779 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6bgpj" podStartSLOduration=3.409787406 podStartE2EDuration="5.828730745s" podCreationTimestamp="2025-12-04 10:46:47 +0000 UTC" firstStartedPulling="2025-12-04 10:46:49.767088001 +0000 UTC m=+4054.633568378" lastFinishedPulling="2025-12-04 10:46:52.18603134 +0000 UTC m=+4057.052511717" observedRunningTime="2025-12-04 10:46:52.822868561 +0000 UTC m=+4057.689348958" watchObservedRunningTime="2025-12-04 10:46:52.828730745 +0000 UTC m=+4057.695211122" Dec 04 10:46:58 crc kubenswrapper[4776]: I1204 10:46:58.288420 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:58 crc kubenswrapper[4776]: I1204 10:46:58.289149 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:58 crc kubenswrapper[4776]: I1204 10:46:58.893203 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:59 crc kubenswrapper[4776]: I1204 10:46:59.914186 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:46:59 crc kubenswrapper[4776]: I1204 10:46:59.970197 
4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6bgpj"] Dec 04 10:47:01 crc kubenswrapper[4776]: I1204 10:47:01.897670 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6bgpj" podUID="bc8b0ced-5068-46cd-8d45-d8bf3362de30" containerName="registry-server" containerID="cri-o://a20b878ce8b9dea823d33399bc18ecc30a7a7c86dc6797ac30e8ba6b3d767341" gracePeriod=2 Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.504144 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.690254 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc8b0ced-5068-46cd-8d45-d8bf3362de30-utilities\") pod \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\" (UID: \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\") " Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.690535 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc8b0ced-5068-46cd-8d45-d8bf3362de30-catalog-content\") pod \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\" (UID: \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\") " Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.691249 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc8b0ced-5068-46cd-8d45-d8bf3362de30-utilities" (OuterVolumeSpecName: "utilities") pod "bc8b0ced-5068-46cd-8d45-d8bf3362de30" (UID: "bc8b0ced-5068-46cd-8d45-d8bf3362de30"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.692148 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nntf6\" (UniqueName: \"kubernetes.io/projected/bc8b0ced-5068-46cd-8d45-d8bf3362de30-kube-api-access-nntf6\") pod \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\" (UID: \"bc8b0ced-5068-46cd-8d45-d8bf3362de30\") " Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.692783 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc8b0ced-5068-46cd-8d45-d8bf3362de30-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.696085 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8b0ced-5068-46cd-8d45-d8bf3362de30-kube-api-access-nntf6" (OuterVolumeSpecName: "kube-api-access-nntf6") pod "bc8b0ced-5068-46cd-8d45-d8bf3362de30" (UID: "bc8b0ced-5068-46cd-8d45-d8bf3362de30"). InnerVolumeSpecName "kube-api-access-nntf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.743028 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc8b0ced-5068-46cd-8d45-d8bf3362de30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc8b0ced-5068-46cd-8d45-d8bf3362de30" (UID: "bc8b0ced-5068-46cd-8d45-d8bf3362de30"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.794494 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc8b0ced-5068-46cd-8d45-d8bf3362de30-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.794532 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nntf6\" (UniqueName: \"kubernetes.io/projected/bc8b0ced-5068-46cd-8d45-d8bf3362de30-kube-api-access-nntf6\") on node \"crc\" DevicePath \"\"" Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.919659 4776 generic.go:334] "Generic (PLEG): container finished" podID="bc8b0ced-5068-46cd-8d45-d8bf3362de30" containerID="a20b878ce8b9dea823d33399bc18ecc30a7a7c86dc6797ac30e8ba6b3d767341" exitCode=0 Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.919740 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bgpj" event={"ID":"bc8b0ced-5068-46cd-8d45-d8bf3362de30","Type":"ContainerDied","Data":"a20b878ce8b9dea823d33399bc18ecc30a7a7c86dc6797ac30e8ba6b3d767341"} Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.919785 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bgpj" event={"ID":"bc8b0ced-5068-46cd-8d45-d8bf3362de30","Type":"ContainerDied","Data":"566faf0d93e4cb3a09afdd3a8f27f6fc3b162b611abc17cdfee1a909cff43b83"} Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.919819 4776 scope.go:117] "RemoveContainer" containerID="a20b878ce8b9dea823d33399bc18ecc30a7a7c86dc6797ac30e8ba6b3d767341" Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.919850 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6bgpj" Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.962660 4776 scope.go:117] "RemoveContainer" containerID="fdd045c52d698e3117fda2f9c81b420d859b992aad72c91d5b6ec7185dbdc5ae" Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.970875 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6bgpj"] Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.980245 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6bgpj"] Dec 04 10:47:02 crc kubenswrapper[4776]: I1204 10:47:02.989157 4776 scope.go:117] "RemoveContainer" containerID="015282c8607bda35b6de7f35ce74d2ba61f654772f736c17301e2370e6d3092f" Dec 04 10:47:03 crc kubenswrapper[4776]: I1204 10:47:03.146222 4776 scope.go:117] "RemoveContainer" containerID="a20b878ce8b9dea823d33399bc18ecc30a7a7c86dc6797ac30e8ba6b3d767341" Dec 04 10:47:03 crc kubenswrapper[4776]: E1204 10:47:03.149579 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a20b878ce8b9dea823d33399bc18ecc30a7a7c86dc6797ac30e8ba6b3d767341\": container with ID starting with a20b878ce8b9dea823d33399bc18ecc30a7a7c86dc6797ac30e8ba6b3d767341 not found: ID does not exist" containerID="a20b878ce8b9dea823d33399bc18ecc30a7a7c86dc6797ac30e8ba6b3d767341" Dec 04 10:47:03 crc kubenswrapper[4776]: I1204 10:47:03.149639 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a20b878ce8b9dea823d33399bc18ecc30a7a7c86dc6797ac30e8ba6b3d767341"} err="failed to get container status \"a20b878ce8b9dea823d33399bc18ecc30a7a7c86dc6797ac30e8ba6b3d767341\": rpc error: code = NotFound desc = could not find container \"a20b878ce8b9dea823d33399bc18ecc30a7a7c86dc6797ac30e8ba6b3d767341\": container with ID starting with a20b878ce8b9dea823d33399bc18ecc30a7a7c86dc6797ac30e8ba6b3d767341 not 
found: ID does not exist"
Dec 04 10:47:03 crc kubenswrapper[4776]: I1204 10:47:03.149689 4776 scope.go:117] "RemoveContainer" containerID="fdd045c52d698e3117fda2f9c81b420d859b992aad72c91d5b6ec7185dbdc5ae"
Dec 04 10:47:03 crc kubenswrapper[4776]: E1204 10:47:03.150342 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdd045c52d698e3117fda2f9c81b420d859b992aad72c91d5b6ec7185dbdc5ae\": container with ID starting with fdd045c52d698e3117fda2f9c81b420d859b992aad72c91d5b6ec7185dbdc5ae not found: ID does not exist" containerID="fdd045c52d698e3117fda2f9c81b420d859b992aad72c91d5b6ec7185dbdc5ae"
Dec 04 10:47:03 crc kubenswrapper[4776]: I1204 10:47:03.150398 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdd045c52d698e3117fda2f9c81b420d859b992aad72c91d5b6ec7185dbdc5ae"} err="failed to get container status \"fdd045c52d698e3117fda2f9c81b420d859b992aad72c91d5b6ec7185dbdc5ae\": rpc error: code = NotFound desc = could not find container \"fdd045c52d698e3117fda2f9c81b420d859b992aad72c91d5b6ec7185dbdc5ae\": container with ID starting with fdd045c52d698e3117fda2f9c81b420d859b992aad72c91d5b6ec7185dbdc5ae not found: ID does not exist"
Dec 04 10:47:03 crc kubenswrapper[4776]: I1204 10:47:03.150436 4776 scope.go:117] "RemoveContainer" containerID="015282c8607bda35b6de7f35ce74d2ba61f654772f736c17301e2370e6d3092f"
Dec 04 10:47:03 crc kubenswrapper[4776]: E1204 10:47:03.150909 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"015282c8607bda35b6de7f35ce74d2ba61f654772f736c17301e2370e6d3092f\": container with ID starting with 015282c8607bda35b6de7f35ce74d2ba61f654772f736c17301e2370e6d3092f not found: ID does not exist" containerID="015282c8607bda35b6de7f35ce74d2ba61f654772f736c17301e2370e6d3092f"
Dec 04 10:47:03 crc kubenswrapper[4776]: I1204 10:47:03.150989 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015282c8607bda35b6de7f35ce74d2ba61f654772f736c17301e2370e6d3092f"} err="failed to get container status \"015282c8607bda35b6de7f35ce74d2ba61f654772f736c17301e2370e6d3092f\": rpc error: code = NotFound desc = could not find container \"015282c8607bda35b6de7f35ce74d2ba61f654772f736c17301e2370e6d3092f\": container with ID starting with 015282c8607bda35b6de7f35ce74d2ba61f654772f736c17301e2370e6d3092f not found: ID does not exist"
Dec 04 10:47:03 crc kubenswrapper[4776]: I1204 10:47:03.465391 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc8b0ced-5068-46cd-8d45-d8bf3362de30" path="/var/lib/kubelet/pods/bc8b0ced-5068-46cd-8d45-d8bf3362de30/volumes"
Dec 04 10:47:19 crc kubenswrapper[4776]: I1204 10:47:19.380283 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 10:47:19 crc kubenswrapper[4776]: I1204 10:47:19.381770 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 10:47:49 crc kubenswrapper[4776]: I1204 10:47:49.379584 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 10:47:49 crc kubenswrapper[4776]: I1204 10:47:49.380228 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 10:48:16 crc kubenswrapper[4776]: I1204 10:48:16.783658 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zs5dc"]
Dec 04 10:48:16 crc kubenswrapper[4776]: E1204 10:48:16.784870 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc8b0ced-5068-46cd-8d45-d8bf3362de30" containerName="extract-content"
Dec 04 10:48:16 crc kubenswrapper[4776]: I1204 10:48:16.784886 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc8b0ced-5068-46cd-8d45-d8bf3362de30" containerName="extract-content"
Dec 04 10:48:16 crc kubenswrapper[4776]: E1204 10:48:16.784941 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc8b0ced-5068-46cd-8d45-d8bf3362de30" containerName="extract-utilities"
Dec 04 10:48:16 crc kubenswrapper[4776]: I1204 10:48:16.784950 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc8b0ced-5068-46cd-8d45-d8bf3362de30" containerName="extract-utilities"
Dec 04 10:48:16 crc kubenswrapper[4776]: E1204 10:48:16.784967 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc8b0ced-5068-46cd-8d45-d8bf3362de30" containerName="registry-server"
Dec 04 10:48:16 crc kubenswrapper[4776]: I1204 10:48:16.784975 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc8b0ced-5068-46cd-8d45-d8bf3362de30" containerName="registry-server"
Dec 04 10:48:16 crc kubenswrapper[4776]: I1204 10:48:16.785206 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc8b0ced-5068-46cd-8d45-d8bf3362de30" containerName="registry-server"
Dec 04 10:48:16 crc kubenswrapper[4776]: I1204 10:48:16.786921 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zs5dc"
Dec 04 10:48:16 crc kubenswrapper[4776]: I1204 10:48:16.797348 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zs5dc"]
Dec 04 10:48:16 crc kubenswrapper[4776]: I1204 10:48:16.901191 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fncmh\" (UniqueName: \"kubernetes.io/projected/3ff99458-ea93-454e-918a-141737b5b6c0-kube-api-access-fncmh\") pod \"redhat-operators-zs5dc\" (UID: \"3ff99458-ea93-454e-918a-141737b5b6c0\") " pod="openshift-marketplace/redhat-operators-zs5dc"
Dec 04 10:48:16 crc kubenswrapper[4776]: I1204 10:48:16.901579 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ff99458-ea93-454e-918a-141737b5b6c0-utilities\") pod \"redhat-operators-zs5dc\" (UID: \"3ff99458-ea93-454e-918a-141737b5b6c0\") " pod="openshift-marketplace/redhat-operators-zs5dc"
Dec 04 10:48:16 crc kubenswrapper[4776]: I1204 10:48:16.901799 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ff99458-ea93-454e-918a-141737b5b6c0-catalog-content\") pod \"redhat-operators-zs5dc\" (UID: \"3ff99458-ea93-454e-918a-141737b5b6c0\") " pod="openshift-marketplace/redhat-operators-zs5dc"
Dec 04 10:48:17 crc kubenswrapper[4776]: I1204 10:48:17.003871 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ff99458-ea93-454e-918a-141737b5b6c0-utilities\") pod \"redhat-operators-zs5dc\" (UID: \"3ff99458-ea93-454e-918a-141737b5b6c0\") " pod="openshift-marketplace/redhat-operators-zs5dc"
Dec 04 10:48:17 crc kubenswrapper[4776]: I1204 10:48:17.003955 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ff99458-ea93-454e-918a-141737b5b6c0-catalog-content\") pod \"redhat-operators-zs5dc\" (UID: \"3ff99458-ea93-454e-918a-141737b5b6c0\") " pod="openshift-marketplace/redhat-operators-zs5dc"
Dec 04 10:48:17 crc kubenswrapper[4776]: I1204 10:48:17.004532 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ff99458-ea93-454e-918a-141737b5b6c0-catalog-content\") pod \"redhat-operators-zs5dc\" (UID: \"3ff99458-ea93-454e-918a-141737b5b6c0\") " pod="openshift-marketplace/redhat-operators-zs5dc"
Dec 04 10:48:17 crc kubenswrapper[4776]: I1204 10:48:17.004583 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fncmh\" (UniqueName: \"kubernetes.io/projected/3ff99458-ea93-454e-918a-141737b5b6c0-kube-api-access-fncmh\") pod \"redhat-operators-zs5dc\" (UID: \"3ff99458-ea93-454e-918a-141737b5b6c0\") " pod="openshift-marketplace/redhat-operators-zs5dc"
Dec 04 10:48:17 crc kubenswrapper[4776]: I1204 10:48:17.004583 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ff99458-ea93-454e-918a-141737b5b6c0-utilities\") pod \"redhat-operators-zs5dc\" (UID: \"3ff99458-ea93-454e-918a-141737b5b6c0\") " pod="openshift-marketplace/redhat-operators-zs5dc"
Dec 04 10:48:17 crc kubenswrapper[4776]: I1204 10:48:17.041450 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fncmh\" (UniqueName: \"kubernetes.io/projected/3ff99458-ea93-454e-918a-141737b5b6c0-kube-api-access-fncmh\") pod \"redhat-operators-zs5dc\" (UID: \"3ff99458-ea93-454e-918a-141737b5b6c0\") " pod="openshift-marketplace/redhat-operators-zs5dc"
Dec 04 10:48:17 crc kubenswrapper[4776]: I1204 10:48:17.121016 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zs5dc"
Dec 04 10:48:17 crc kubenswrapper[4776]: I1204 10:48:17.630760 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zs5dc"]
Dec 04 10:48:18 crc kubenswrapper[4776]: I1204 10:48:18.643026 4776 generic.go:334] "Generic (PLEG): container finished" podID="3ff99458-ea93-454e-918a-141737b5b6c0" containerID="85000563e6b65500686b85936d4beee1c76754fef6d3c2099e11ed5a7792ab15" exitCode=0
Dec 04 10:48:18 crc kubenswrapper[4776]: I1204 10:48:18.643212 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5dc" event={"ID":"3ff99458-ea93-454e-918a-141737b5b6c0","Type":"ContainerDied","Data":"85000563e6b65500686b85936d4beee1c76754fef6d3c2099e11ed5a7792ab15"}
Dec 04 10:48:18 crc kubenswrapper[4776]: I1204 10:48:18.643453 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5dc" event={"ID":"3ff99458-ea93-454e-918a-141737b5b6c0","Type":"ContainerStarted","Data":"2d746aebaf839ce3d9371857f55855c7950767f26f18cdbc632afe374084b9e4"}
Dec 04 10:48:19 crc kubenswrapper[4776]: I1204 10:48:19.380467 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 10:48:19 crc kubenswrapper[4776]: I1204 10:48:19.381531 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 10:48:19 crc kubenswrapper[4776]: I1204 10:48:19.381606 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt"
Dec 04 10:48:19 crc kubenswrapper[4776]: I1204 10:48:19.382986 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de6ae4981b991fcd0800f65005c52451c9f035f8277040b04437c770c82be074"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 10:48:19 crc kubenswrapper[4776]: I1204 10:48:19.383119 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://de6ae4981b991fcd0800f65005c52451c9f035f8277040b04437c770c82be074" gracePeriod=600
Dec 04 10:48:19 crc kubenswrapper[4776]: I1204 10:48:19.656069 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="de6ae4981b991fcd0800f65005c52451c9f035f8277040b04437c770c82be074" exitCode=0
Dec 04 10:48:19 crc kubenswrapper[4776]: I1204 10:48:19.656101 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"de6ae4981b991fcd0800f65005c52451c9f035f8277040b04437c770c82be074"}
Dec 04 10:48:19 crc kubenswrapper[4776]: I1204 10:48:19.656628 4776 scope.go:117] "RemoveContainer" containerID="5d6a10f78bd187c89f34a3c668d78a422375cde3c309aabb67d785afaaccaab5"
Dec 04 10:48:20 crc kubenswrapper[4776]: I1204 10:48:20.673755 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306"}
Dec 04 10:48:24 crc kubenswrapper[4776]: I1204 10:48:24.736477 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5dc" event={"ID":"3ff99458-ea93-454e-918a-141737b5b6c0","Type":"ContainerStarted","Data":"6145ad53f621265a8bc3025589b6323930df6ed4b35af91e45d903a9c3caa8fa"}
Dec 04 10:48:25 crc kubenswrapper[4776]: I1204 10:48:25.749803 4776 generic.go:334] "Generic (PLEG): container finished" podID="3ff99458-ea93-454e-918a-141737b5b6c0" containerID="6145ad53f621265a8bc3025589b6323930df6ed4b35af91e45d903a9c3caa8fa" exitCode=0
Dec 04 10:48:25 crc kubenswrapper[4776]: I1204 10:48:25.749885 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5dc" event={"ID":"3ff99458-ea93-454e-918a-141737b5b6c0","Type":"ContainerDied","Data":"6145ad53f621265a8bc3025589b6323930df6ed4b35af91e45d903a9c3caa8fa"}
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.220278 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jwrkx"]
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.223593 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwrkx"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.247371 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwrkx"]
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.394156 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjg5p\" (UniqueName: \"kubernetes.io/projected/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-kube-api-access-bjg5p\") pod \"certified-operators-jwrkx\" (UID: \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\") " pod="openshift-marketplace/certified-operators-jwrkx"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.394228 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-utilities\") pod \"certified-operators-jwrkx\" (UID: \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\") " pod="openshift-marketplace/certified-operators-jwrkx"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.394461 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-catalog-content\") pod \"certified-operators-jwrkx\" (UID: \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\") " pod="openshift-marketplace/certified-operators-jwrkx"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.496649 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-catalog-content\") pod \"certified-operators-jwrkx\" (UID: \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\") " pod="openshift-marketplace/certified-operators-jwrkx"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.496819 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjg5p\" (UniqueName: \"kubernetes.io/projected/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-kube-api-access-bjg5p\") pod \"certified-operators-jwrkx\" (UID: \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\") " pod="openshift-marketplace/certified-operators-jwrkx"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.496878 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-utilities\") pod \"certified-operators-jwrkx\" (UID: \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\") " pod="openshift-marketplace/certified-operators-jwrkx"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.497302 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-catalog-content\") pod \"certified-operators-jwrkx\" (UID: \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\") " pod="openshift-marketplace/certified-operators-jwrkx"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.497355 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-utilities\") pod \"certified-operators-jwrkx\" (UID: \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\") " pod="openshift-marketplace/certified-operators-jwrkx"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.520026 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjg5p\" (UniqueName: \"kubernetes.io/projected/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-kube-api-access-bjg5p\") pod \"certified-operators-jwrkx\" (UID: \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\") " pod="openshift-marketplace/certified-operators-jwrkx"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.542800 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwrkx"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.827892 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zmtzc"]
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.830182 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.840711 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmtzc"]
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.920259 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b0a3a6-1dff-4990-82a1-d73a43286e8d-utilities\") pod \"redhat-marketplace-zmtzc\" (UID: \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\") " pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.920319 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqbcl\" (UniqueName: \"kubernetes.io/projected/13b0a3a6-1dff-4990-82a1-d73a43286e8d-kube-api-access-bqbcl\") pod \"redhat-marketplace-zmtzc\" (UID: \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\") " pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:29 crc kubenswrapper[4776]: I1204 10:48:29.920354 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b0a3a6-1dff-4990-82a1-d73a43286e8d-catalog-content\") pod \"redhat-marketplace-zmtzc\" (UID: \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\") " pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:30 crc kubenswrapper[4776]: I1204 10:48:30.023246 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b0a3a6-1dff-4990-82a1-d73a43286e8d-utilities\") pod \"redhat-marketplace-zmtzc\" (UID: \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\") " pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:30 crc kubenswrapper[4776]: I1204 10:48:30.023308 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqbcl\" (UniqueName: \"kubernetes.io/projected/13b0a3a6-1dff-4990-82a1-d73a43286e8d-kube-api-access-bqbcl\") pod \"redhat-marketplace-zmtzc\" (UID: \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\") " pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:30 crc kubenswrapper[4776]: I1204 10:48:30.023335 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b0a3a6-1dff-4990-82a1-d73a43286e8d-catalog-content\") pod \"redhat-marketplace-zmtzc\" (UID: \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\") " pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:30 crc kubenswrapper[4776]: I1204 10:48:30.023971 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b0a3a6-1dff-4990-82a1-d73a43286e8d-catalog-content\") pod \"redhat-marketplace-zmtzc\" (UID: \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\") " pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:30 crc kubenswrapper[4776]: I1204 10:48:30.023973 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b0a3a6-1dff-4990-82a1-d73a43286e8d-utilities\") pod \"redhat-marketplace-zmtzc\" (UID: \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\") " pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:30 crc kubenswrapper[4776]: I1204 10:48:30.044700 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqbcl\" (UniqueName: \"kubernetes.io/projected/13b0a3a6-1dff-4990-82a1-d73a43286e8d-kube-api-access-bqbcl\") pod \"redhat-marketplace-zmtzc\" (UID: \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\") " pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:30 crc kubenswrapper[4776]: I1204 10:48:30.166025 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:30 crc kubenswrapper[4776]: I1204 10:48:30.181483 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwrkx"]
Dec 04 10:48:30 crc kubenswrapper[4776]: I1204 10:48:30.803894 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwrkx" event={"ID":"93218ae5-7f9f-4e4e-b79d-3088fe0b3597","Type":"ContainerStarted","Data":"c1a4668ac570419aebcdab47fd0eef624cde4893a552f72aeb423d6bf5e174b0"}
Dec 04 10:48:31 crc kubenswrapper[4776]: I1204 10:48:31.496796 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmtzc"]
Dec 04 10:48:31 crc kubenswrapper[4776]: I1204 10:48:31.815013 4776 generic.go:334] "Generic (PLEG): container finished" podID="13b0a3a6-1dff-4990-82a1-d73a43286e8d" containerID="3c117dde46d5b712a154bb42b91ec595def54eacc918c2715b0bf1e4fa8a91c3" exitCode=0
Dec 04 10:48:31 crc kubenswrapper[4776]: I1204 10:48:31.815197 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmtzc" event={"ID":"13b0a3a6-1dff-4990-82a1-d73a43286e8d","Type":"ContainerDied","Data":"3c117dde46d5b712a154bb42b91ec595def54eacc918c2715b0bf1e4fa8a91c3"}
Dec 04 10:48:31 crc kubenswrapper[4776]: I1204 10:48:31.816227 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmtzc" event={"ID":"13b0a3a6-1dff-4990-82a1-d73a43286e8d","Type":"ContainerStarted","Data":"98948ddb0676d7e652b3d5c9053da12d7fdd7508a0bd4f1f3cf11686df237665"}
Dec 04 10:48:31 crc kubenswrapper[4776]: I1204 10:48:31.819401 4776 generic.go:334] "Generic (PLEG): container finished" podID="93218ae5-7f9f-4e4e-b79d-3088fe0b3597" containerID="9b21669d12d3be8e1b049267302065e87c8cc87e3616d128fe6a695d7875b36e" exitCode=0
Dec 04 10:48:31 crc kubenswrapper[4776]: I1204 10:48:31.819464 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwrkx" event={"ID":"93218ae5-7f9f-4e4e-b79d-3088fe0b3597","Type":"ContainerDied","Data":"9b21669d12d3be8e1b049267302065e87c8cc87e3616d128fe6a695d7875b36e"}
Dec 04 10:48:32 crc kubenswrapper[4776]: I1204 10:48:32.831216 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5dc" event={"ID":"3ff99458-ea93-454e-918a-141737b5b6c0","Type":"ContainerStarted","Data":"ac0dff11952e235bdf0fa0def4776b5fff71952b298043b9cac1a73a28167614"}
Dec 04 10:48:32 crc kubenswrapper[4776]: I1204 10:48:32.835105 4776 generic.go:334] "Generic (PLEG): container finished" podID="13b0a3a6-1dff-4990-82a1-d73a43286e8d" containerID="1a8872d4779d3e80642c0a1f39aa49e366d6956ec21851257d286de381889bfd" exitCode=0
Dec 04 10:48:32 crc kubenswrapper[4776]: I1204 10:48:32.835315 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmtzc" event={"ID":"13b0a3a6-1dff-4990-82a1-d73a43286e8d","Type":"ContainerDied","Data":"1a8872d4779d3e80642c0a1f39aa49e366d6956ec21851257d286de381889bfd"}
Dec 04 10:48:32 crc kubenswrapper[4776]: I1204 10:48:32.840350 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwrkx" event={"ID":"93218ae5-7f9f-4e4e-b79d-3088fe0b3597","Type":"ContainerStarted","Data":"9a8d64a14ef2251cf0be02c83339bbd2a9e98125ae832461fa2267b6678cc158"}
Dec 04 10:48:32 crc kubenswrapper[4776]: I1204 10:48:32.858996 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zs5dc" podStartSLOduration=3.8687140060000003 podStartE2EDuration="16.85895932s" podCreationTimestamp="2025-12-04 10:48:16 +0000 UTC" firstStartedPulling="2025-12-04 10:48:18.645196329 +0000 UTC m=+4143.511676696" lastFinishedPulling="2025-12-04 10:48:31.635441633 +0000 UTC m=+4156.501922010" observedRunningTime="2025-12-04 10:48:32.851423223 +0000 UTC m=+4157.717903600" watchObservedRunningTime="2025-12-04 10:48:32.85895932 +0000 UTC m=+4157.725439697"
Dec 04 10:48:34 crc kubenswrapper[4776]: I1204 10:48:34.867642 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmtzc" event={"ID":"13b0a3a6-1dff-4990-82a1-d73a43286e8d","Type":"ContainerStarted","Data":"32d67a96d870065a4080478dfc99ec88bb2153c1b70f62c56c08015d2f40f216"}
Dec 04 10:48:34 crc kubenswrapper[4776]: I1204 10:48:34.890360 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zmtzc" podStartSLOduration=4.488149131 podStartE2EDuration="5.890336652s" podCreationTimestamp="2025-12-04 10:48:29 +0000 UTC" firstStartedPulling="2025-12-04 10:48:31.817185924 +0000 UTC m=+4156.683666321" lastFinishedPulling="2025-12-04 10:48:33.219373465 +0000 UTC m=+4158.085853842" observedRunningTime="2025-12-04 10:48:34.889519996 +0000 UTC m=+4159.756000393" watchObservedRunningTime="2025-12-04 10:48:34.890336652 +0000 UTC m=+4159.756817029"
Dec 04 10:48:36 crc kubenswrapper[4776]: I1204 10:48:36.889752 4776 generic.go:334] "Generic (PLEG): container finished" podID="93218ae5-7f9f-4e4e-b79d-3088fe0b3597" containerID="9a8d64a14ef2251cf0be02c83339bbd2a9e98125ae832461fa2267b6678cc158" exitCode=0
Dec 04 10:48:36 crc kubenswrapper[4776]: I1204 10:48:36.889838 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwrkx" event={"ID":"93218ae5-7f9f-4e4e-b79d-3088fe0b3597","Type":"ContainerDied","Data":"9a8d64a14ef2251cf0be02c83339bbd2a9e98125ae832461fa2267b6678cc158"}
Dec 04 10:48:37 crc kubenswrapper[4776]: I1204 10:48:37.121910 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zs5dc"
Dec 04 10:48:37 crc kubenswrapper[4776]: I1204 10:48:37.122253 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zs5dc"
Dec 04 10:48:38 crc kubenswrapper[4776]: I1204 10:48:38.170693 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zs5dc" podUID="3ff99458-ea93-454e-918a-141737b5b6c0" containerName="registry-server" probeResult="failure" output=<
Dec 04 10:48:38 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s
Dec 04 10:48:38 crc kubenswrapper[4776]: >
Dec 04 10:48:39 crc kubenswrapper[4776]: I1204 10:48:39.945044 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwrkx" event={"ID":"93218ae5-7f9f-4e4e-b79d-3088fe0b3597","Type":"ContainerStarted","Data":"b57dfd1c8e3c54592b0b69ec4632bd8a0191f4041f24e33bdae61d3e5254f659"}
Dec 04 10:48:39 crc kubenswrapper[4776]: I1204 10:48:39.979616 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jwrkx" podStartSLOduration=3.742650034 podStartE2EDuration="10.979591521s" podCreationTimestamp="2025-12-04 10:48:29 +0000 UTC" firstStartedPulling="2025-12-04 10:48:31.822576373 +0000 UTC m=+4156.689056750" lastFinishedPulling="2025-12-04 10:48:39.05951787 +0000 UTC m=+4163.925998237" observedRunningTime="2025-12-04 10:48:39.969388701 +0000 UTC m=+4164.835869078" watchObservedRunningTime="2025-12-04 10:48:39.979591521 +0000 UTC m=+4164.846071908"
Dec 04 10:48:40 crc kubenswrapper[4776]: I1204 10:48:40.166961 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:40 crc kubenswrapper[4776]: I1204 10:48:40.167353 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:40 crc kubenswrapper[4776]: I1204 10:48:40.221767 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:41 crc kubenswrapper[4776]: I1204 10:48:41.016547 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:42 crc kubenswrapper[4776]: I1204 10:48:42.610046 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmtzc"]
Dec 04 10:48:42 crc kubenswrapper[4776]: I1204 10:48:42.972762 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zmtzc" podUID="13b0a3a6-1dff-4990-82a1-d73a43286e8d" containerName="registry-server" containerID="cri-o://32d67a96d870065a4080478dfc99ec88bb2153c1b70f62c56c08015d2f40f216" gracePeriod=2
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.594507 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.711474 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqbcl\" (UniqueName: \"kubernetes.io/projected/13b0a3a6-1dff-4990-82a1-d73a43286e8d-kube-api-access-bqbcl\") pod \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\" (UID: \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\") "
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.711596 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b0a3a6-1dff-4990-82a1-d73a43286e8d-catalog-content\") pod \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\" (UID: \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\") "
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.711751 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b0a3a6-1dff-4990-82a1-d73a43286e8d-utilities\") pod \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\" (UID: \"13b0a3a6-1dff-4990-82a1-d73a43286e8d\") "
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.718368 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13b0a3a6-1dff-4990-82a1-d73a43286e8d-utilities" (OuterVolumeSpecName: "utilities") pod "13b0a3a6-1dff-4990-82a1-d73a43286e8d" (UID: "13b0a3a6-1dff-4990-82a1-d73a43286e8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.763937 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13b0a3a6-1dff-4990-82a1-d73a43286e8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13b0a3a6-1dff-4990-82a1-d73a43286e8d" (UID: "13b0a3a6-1dff-4990-82a1-d73a43286e8d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.764164 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13b0a3a6-1dff-4990-82a1-d73a43286e8d-kube-api-access-bqbcl" (OuterVolumeSpecName: "kube-api-access-bqbcl") pod "13b0a3a6-1dff-4990-82a1-d73a43286e8d" (UID: "13b0a3a6-1dff-4990-82a1-d73a43286e8d"). InnerVolumeSpecName "kube-api-access-bqbcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.820017 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqbcl\" (UniqueName: \"kubernetes.io/projected/13b0a3a6-1dff-4990-82a1-d73a43286e8d-kube-api-access-bqbcl\") on node \"crc\" DevicePath \"\""
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.820251 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b0a3a6-1dff-4990-82a1-d73a43286e8d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.820329 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b0a3a6-1dff-4990-82a1-d73a43286e8d-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.984484 4776 generic.go:334] "Generic (PLEG): container finished" podID="13b0a3a6-1dff-4990-82a1-d73a43286e8d" containerID="32d67a96d870065a4080478dfc99ec88bb2153c1b70f62c56c08015d2f40f216" exitCode=0
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.984554 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmtzc" event={"ID":"13b0a3a6-1dff-4990-82a1-d73a43286e8d","Type":"ContainerDied","Data":"32d67a96d870065a4080478dfc99ec88bb2153c1b70f62c56c08015d2f40f216"}
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.984591 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmtzc" event={"ID":"13b0a3a6-1dff-4990-82a1-d73a43286e8d","Type":"ContainerDied","Data":"98948ddb0676d7e652b3d5c9053da12d7fdd7508a0bd4f1f3cf11686df237665"}
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.984616 4776 scope.go:117] "RemoveContainer" containerID="32d67a96d870065a4080478dfc99ec88bb2153c1b70f62c56c08015d2f40f216"
Dec 04 10:48:43 crc kubenswrapper[4776]: I1204 10:48:43.984833 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmtzc"
Dec 04 10:48:44 crc kubenswrapper[4776]: I1204 10:48:44.018282 4776 scope.go:117] "RemoveContainer" containerID="1a8872d4779d3e80642c0a1f39aa49e366d6956ec21851257d286de381889bfd"
Dec 04 10:48:44 crc kubenswrapper[4776]: I1204 10:48:44.026374 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmtzc"]
Dec 04 10:48:44 crc kubenswrapper[4776]: I1204 10:48:44.037028 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmtzc"]
Dec 04 10:48:44 crc kubenswrapper[4776]: I1204 10:48:44.044196 4776 scope.go:117] "RemoveContainer" containerID="3c117dde46d5b712a154bb42b91ec595def54eacc918c2715b0bf1e4fa8a91c3"
Dec 04 10:48:44 crc kubenswrapper[4776]: I1204 10:48:44.100750 4776 scope.go:117] "RemoveContainer" containerID="32d67a96d870065a4080478dfc99ec88bb2153c1b70f62c56c08015d2f40f216"
Dec 04 10:48:44 crc kubenswrapper[4776]: E1204 10:48:44.101674 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d67a96d870065a4080478dfc99ec88bb2153c1b70f62c56c08015d2f40f216\": container with ID starting with 32d67a96d870065a4080478dfc99ec88bb2153c1b70f62c56c08015d2f40f216 not found: ID does not exist" containerID="32d67a96d870065a4080478dfc99ec88bb2153c1b70f62c56c08015d2f40f216"
Dec 04 10:48:44 crc kubenswrapper[4776]: I1204 10:48:44.101776 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d67a96d870065a4080478dfc99ec88bb2153c1b70f62c56c08015d2f40f216"} err="failed to get container status \"32d67a96d870065a4080478dfc99ec88bb2153c1b70f62c56c08015d2f40f216\": rpc error: code = NotFound desc = could not find container \"32d67a96d870065a4080478dfc99ec88bb2153c1b70f62c56c08015d2f40f216\": container with ID starting with 32d67a96d870065a4080478dfc99ec88bb2153c1b70f62c56c08015d2f40f216 not found: ID does not exist"
Dec 04 10:48:44 crc kubenswrapper[4776]: I1204 10:48:44.101837 4776 scope.go:117] "RemoveContainer" containerID="1a8872d4779d3e80642c0a1f39aa49e366d6956ec21851257d286de381889bfd"
Dec 04 10:48:44 crc kubenswrapper[4776]: E1204 10:48:44.102460 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a8872d4779d3e80642c0a1f39aa49e366d6956ec21851257d286de381889bfd\": container with ID starting with 1a8872d4779d3e80642c0a1f39aa49e366d6956ec21851257d286de381889bfd not found: ID does not exist" containerID="1a8872d4779d3e80642c0a1f39aa49e366d6956ec21851257d286de381889bfd"
Dec 04 10:48:44 crc kubenswrapper[4776]: I1204 10:48:44.102514 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8872d4779d3e80642c0a1f39aa49e366d6956ec21851257d286de381889bfd"} err="failed to get container status \"1a8872d4779d3e80642c0a1f39aa49e366d6956ec21851257d286de381889bfd\": rpc error: code = NotFound desc = could not find container \"1a8872d4779d3e80642c0a1f39aa49e366d6956ec21851257d286de381889bfd\": container with ID starting with 1a8872d4779d3e80642c0a1f39aa49e366d6956ec21851257d286de381889bfd not found: ID does not exist"
Dec 04 10:48:44 crc kubenswrapper[4776]: I1204 10:48:44.102551 4776 scope.go:117] "RemoveContainer" containerID="3c117dde46d5b712a154bb42b91ec595def54eacc918c2715b0bf1e4fa8a91c3"
Dec 04 10:48:44 crc
kubenswrapper[4776]: E1204 10:48:44.102950 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c117dde46d5b712a154bb42b91ec595def54eacc918c2715b0bf1e4fa8a91c3\": container with ID starting with 3c117dde46d5b712a154bb42b91ec595def54eacc918c2715b0bf1e4fa8a91c3 not found: ID does not exist" containerID="3c117dde46d5b712a154bb42b91ec595def54eacc918c2715b0bf1e4fa8a91c3" Dec 04 10:48:44 crc kubenswrapper[4776]: I1204 10:48:44.102986 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c117dde46d5b712a154bb42b91ec595def54eacc918c2715b0bf1e4fa8a91c3"} err="failed to get container status \"3c117dde46d5b712a154bb42b91ec595def54eacc918c2715b0bf1e4fa8a91c3\": rpc error: code = NotFound desc = could not find container \"3c117dde46d5b712a154bb42b91ec595def54eacc918c2715b0bf1e4fa8a91c3\": container with ID starting with 3c117dde46d5b712a154bb42b91ec595def54eacc918c2715b0bf1e4fa8a91c3 not found: ID does not exist" Dec 04 10:48:45 crc kubenswrapper[4776]: I1204 10:48:45.464000 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13b0a3a6-1dff-4990-82a1-d73a43286e8d" path="/var/lib/kubelet/pods/13b0a3a6-1dff-4990-82a1-d73a43286e8d/volumes" Dec 04 10:48:47 crc kubenswrapper[4776]: I1204 10:48:47.180839 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zs5dc" Dec 04 10:48:47 crc kubenswrapper[4776]: I1204 10:48:47.239667 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zs5dc" Dec 04 10:48:48 crc kubenswrapper[4776]: I1204 10:48:48.408318 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zs5dc"] Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.037128 4776 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-zs5dc" podUID="3ff99458-ea93-454e-918a-141737b5b6c0" containerName="registry-server" containerID="cri-o://ac0dff11952e235bdf0fa0def4776b5fff71952b298043b9cac1a73a28167614" gracePeriod=2 Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.543435 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jwrkx" Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.544058 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jwrkx" Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.555557 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zs5dc" Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.606361 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jwrkx" Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.705282 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ff99458-ea93-454e-918a-141737b5b6c0-catalog-content\") pod \"3ff99458-ea93-454e-918a-141737b5b6c0\" (UID: \"3ff99458-ea93-454e-918a-141737b5b6c0\") " Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.705728 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fncmh\" (UniqueName: \"kubernetes.io/projected/3ff99458-ea93-454e-918a-141737b5b6c0-kube-api-access-fncmh\") pod \"3ff99458-ea93-454e-918a-141737b5b6c0\" (UID: \"3ff99458-ea93-454e-918a-141737b5b6c0\") " Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.705825 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ff99458-ea93-454e-918a-141737b5b6c0-utilities\") pod 
\"3ff99458-ea93-454e-918a-141737b5b6c0\" (UID: \"3ff99458-ea93-454e-918a-141737b5b6c0\") " Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.706660 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ff99458-ea93-454e-918a-141737b5b6c0-utilities" (OuterVolumeSpecName: "utilities") pod "3ff99458-ea93-454e-918a-141737b5b6c0" (UID: "3ff99458-ea93-454e-918a-141737b5b6c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.729826 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff99458-ea93-454e-918a-141737b5b6c0-kube-api-access-fncmh" (OuterVolumeSpecName: "kube-api-access-fncmh") pod "3ff99458-ea93-454e-918a-141737b5b6c0" (UID: "3ff99458-ea93-454e-918a-141737b5b6c0"). InnerVolumeSpecName "kube-api-access-fncmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.808627 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fncmh\" (UniqueName: \"kubernetes.io/projected/3ff99458-ea93-454e-918a-141737b5b6c0-kube-api-access-fncmh\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.808675 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ff99458-ea93-454e-918a-141737b5b6c0-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.823381 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ff99458-ea93-454e-918a-141737b5b6c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ff99458-ea93-454e-918a-141737b5b6c0" (UID: "3ff99458-ea93-454e-918a-141737b5b6c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:48:49 crc kubenswrapper[4776]: I1204 10:48:49.910948 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ff99458-ea93-454e-918a-141737b5b6c0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.049975 4776 generic.go:334] "Generic (PLEG): container finished" podID="3ff99458-ea93-454e-918a-141737b5b6c0" containerID="ac0dff11952e235bdf0fa0def4776b5fff71952b298043b9cac1a73a28167614" exitCode=0 Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.050156 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zs5dc" Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.050209 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5dc" event={"ID":"3ff99458-ea93-454e-918a-141737b5b6c0","Type":"ContainerDied","Data":"ac0dff11952e235bdf0fa0def4776b5fff71952b298043b9cac1a73a28167614"} Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.050253 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5dc" event={"ID":"3ff99458-ea93-454e-918a-141737b5b6c0","Type":"ContainerDied","Data":"2d746aebaf839ce3d9371857f55855c7950767f26f18cdbc632afe374084b9e4"} Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.050291 4776 scope.go:117] "RemoveContainer" containerID="ac0dff11952e235bdf0fa0def4776b5fff71952b298043b9cac1a73a28167614" Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.072613 4776 scope.go:117] "RemoveContainer" containerID="6145ad53f621265a8bc3025589b6323930df6ed4b35af91e45d903a9c3caa8fa" Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.090605 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zs5dc"] Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 
10:48:50.104132 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zs5dc"] Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.111117 4776 scope.go:117] "RemoveContainer" containerID="85000563e6b65500686b85936d4beee1c76754fef6d3c2099e11ed5a7792ab15" Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.116748 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jwrkx" Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.156578 4776 scope.go:117] "RemoveContainer" containerID="ac0dff11952e235bdf0fa0def4776b5fff71952b298043b9cac1a73a28167614" Dec 04 10:48:50 crc kubenswrapper[4776]: E1204 10:48:50.158793 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac0dff11952e235bdf0fa0def4776b5fff71952b298043b9cac1a73a28167614\": container with ID starting with ac0dff11952e235bdf0fa0def4776b5fff71952b298043b9cac1a73a28167614 not found: ID does not exist" containerID="ac0dff11952e235bdf0fa0def4776b5fff71952b298043b9cac1a73a28167614" Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.158837 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac0dff11952e235bdf0fa0def4776b5fff71952b298043b9cac1a73a28167614"} err="failed to get container status \"ac0dff11952e235bdf0fa0def4776b5fff71952b298043b9cac1a73a28167614\": rpc error: code = NotFound desc = could not find container \"ac0dff11952e235bdf0fa0def4776b5fff71952b298043b9cac1a73a28167614\": container with ID starting with ac0dff11952e235bdf0fa0def4776b5fff71952b298043b9cac1a73a28167614 not found: ID does not exist" Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.158864 4776 scope.go:117] "RemoveContainer" containerID="6145ad53f621265a8bc3025589b6323930df6ed4b35af91e45d903a9c3caa8fa" Dec 04 10:48:50 crc kubenswrapper[4776]: E1204 10:48:50.160067 4776 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6145ad53f621265a8bc3025589b6323930df6ed4b35af91e45d903a9c3caa8fa\": container with ID starting with 6145ad53f621265a8bc3025589b6323930df6ed4b35af91e45d903a9c3caa8fa not found: ID does not exist" containerID="6145ad53f621265a8bc3025589b6323930df6ed4b35af91e45d903a9c3caa8fa" Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.160113 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6145ad53f621265a8bc3025589b6323930df6ed4b35af91e45d903a9c3caa8fa"} err="failed to get container status \"6145ad53f621265a8bc3025589b6323930df6ed4b35af91e45d903a9c3caa8fa\": rpc error: code = NotFound desc = could not find container \"6145ad53f621265a8bc3025589b6323930df6ed4b35af91e45d903a9c3caa8fa\": container with ID starting with 6145ad53f621265a8bc3025589b6323930df6ed4b35af91e45d903a9c3caa8fa not found: ID does not exist" Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.160150 4776 scope.go:117] "RemoveContainer" containerID="85000563e6b65500686b85936d4beee1c76754fef6d3c2099e11ed5a7792ab15" Dec 04 10:48:50 crc kubenswrapper[4776]: E1204 10:48:50.160728 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85000563e6b65500686b85936d4beee1c76754fef6d3c2099e11ed5a7792ab15\": container with ID starting with 85000563e6b65500686b85936d4beee1c76754fef6d3c2099e11ed5a7792ab15 not found: ID does not exist" containerID="85000563e6b65500686b85936d4beee1c76754fef6d3c2099e11ed5a7792ab15" Dec 04 10:48:50 crc kubenswrapper[4776]: I1204 10:48:50.160767 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85000563e6b65500686b85936d4beee1c76754fef6d3c2099e11ed5a7792ab15"} err="failed to get container status \"85000563e6b65500686b85936d4beee1c76754fef6d3c2099e11ed5a7792ab15\": rpc error: code = NotFound desc = could 
not find container \"85000563e6b65500686b85936d4beee1c76754fef6d3c2099e11ed5a7792ab15\": container with ID starting with 85000563e6b65500686b85936d4beee1c76754fef6d3c2099e11ed5a7792ab15 not found: ID does not exist" Dec 04 10:48:51 crc kubenswrapper[4776]: I1204 10:48:51.476826 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ff99458-ea93-454e-918a-141737b5b6c0" path="/var/lib/kubelet/pods/3ff99458-ea93-454e-918a-141737b5b6c0/volumes" Dec 04 10:48:52 crc kubenswrapper[4776]: I1204 10:48:52.020195 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwrkx"] Dec 04 10:48:52 crc kubenswrapper[4776]: I1204 10:48:52.070818 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jwrkx" podUID="93218ae5-7f9f-4e4e-b79d-3088fe0b3597" containerName="registry-server" containerID="cri-o://b57dfd1c8e3c54592b0b69ec4632bd8a0191f4041f24e33bdae61d3e5254f659" gracePeriod=2 Dec 04 10:48:53 crc kubenswrapper[4776]: I1204 10:48:53.087422 4776 generic.go:334] "Generic (PLEG): container finished" podID="93218ae5-7f9f-4e4e-b79d-3088fe0b3597" containerID="b57dfd1c8e3c54592b0b69ec4632bd8a0191f4041f24e33bdae61d3e5254f659" exitCode=0 Dec 04 10:48:53 crc kubenswrapper[4776]: I1204 10:48:53.087749 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwrkx" event={"ID":"93218ae5-7f9f-4e4e-b79d-3088fe0b3597","Type":"ContainerDied","Data":"b57dfd1c8e3c54592b0b69ec4632bd8a0191f4041f24e33bdae61d3e5254f659"} Dec 04 10:48:53 crc kubenswrapper[4776]: I1204 10:48:53.350704 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jwrkx" Dec 04 10:48:53 crc kubenswrapper[4776]: I1204 10:48:53.488320 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjg5p\" (UniqueName: \"kubernetes.io/projected/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-kube-api-access-bjg5p\") pod \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\" (UID: \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\") " Dec 04 10:48:53 crc kubenswrapper[4776]: I1204 10:48:53.488451 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-utilities\") pod \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\" (UID: \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\") " Dec 04 10:48:53 crc kubenswrapper[4776]: I1204 10:48:53.488555 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-catalog-content\") pod \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\" (UID: \"93218ae5-7f9f-4e4e-b79d-3088fe0b3597\") " Dec 04 10:48:53 crc kubenswrapper[4776]: I1204 10:48:53.489520 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-utilities" (OuterVolumeSpecName: "utilities") pod "93218ae5-7f9f-4e4e-b79d-3088fe0b3597" (UID: "93218ae5-7f9f-4e4e-b79d-3088fe0b3597"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:48:53 crc kubenswrapper[4776]: I1204 10:48:53.512160 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-kube-api-access-bjg5p" (OuterVolumeSpecName: "kube-api-access-bjg5p") pod "93218ae5-7f9f-4e4e-b79d-3088fe0b3597" (UID: "93218ae5-7f9f-4e4e-b79d-3088fe0b3597"). InnerVolumeSpecName "kube-api-access-bjg5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:48:53 crc kubenswrapper[4776]: I1204 10:48:53.538942 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93218ae5-7f9f-4e4e-b79d-3088fe0b3597" (UID: "93218ae5-7f9f-4e4e-b79d-3088fe0b3597"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:48:53 crc kubenswrapper[4776]: I1204 10:48:53.591398 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjg5p\" (UniqueName: \"kubernetes.io/projected/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-kube-api-access-bjg5p\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:53 crc kubenswrapper[4776]: I1204 10:48:53.591441 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:53 crc kubenswrapper[4776]: I1204 10:48:53.591454 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93218ae5-7f9f-4e4e-b79d-3088fe0b3597-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:48:54 crc kubenswrapper[4776]: I1204 10:48:54.121812 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwrkx" event={"ID":"93218ae5-7f9f-4e4e-b79d-3088fe0b3597","Type":"ContainerDied","Data":"c1a4668ac570419aebcdab47fd0eef624cde4893a552f72aeb423d6bf5e174b0"} Dec 04 10:48:54 crc kubenswrapper[4776]: I1204 10:48:54.121883 4776 scope.go:117] "RemoveContainer" containerID="b57dfd1c8e3c54592b0b69ec4632bd8a0191f4041f24e33bdae61d3e5254f659" Dec 04 10:48:54 crc kubenswrapper[4776]: I1204 10:48:54.121900 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jwrkx" Dec 04 10:48:54 crc kubenswrapper[4776]: I1204 10:48:54.164469 4776 scope.go:117] "RemoveContainer" containerID="9a8d64a14ef2251cf0be02c83339bbd2a9e98125ae832461fa2267b6678cc158" Dec 04 10:48:54 crc kubenswrapper[4776]: I1204 10:48:54.171298 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwrkx"] Dec 04 10:48:54 crc kubenswrapper[4776]: I1204 10:48:54.222321 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jwrkx"] Dec 04 10:48:54 crc kubenswrapper[4776]: I1204 10:48:54.280250 4776 scope.go:117] "RemoveContainer" containerID="9b21669d12d3be8e1b049267302065e87c8cc87e3616d128fe6a695d7875b36e" Dec 04 10:48:55 crc kubenswrapper[4776]: I1204 10:48:55.463731 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93218ae5-7f9f-4e4e-b79d-3088fe0b3597" path="/var/lib/kubelet/pods/93218ae5-7f9f-4e4e-b79d-3088fe0b3597/volumes" Dec 04 10:50:19 crc kubenswrapper[4776]: I1204 10:50:19.381014 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:50:19 crc kubenswrapper[4776]: I1204 10:50:19.381680 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:50:49 crc kubenswrapper[4776]: I1204 10:50:49.380301 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:50:49 crc kubenswrapper[4776]: I1204 10:50:49.381103 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:51:19 crc kubenswrapper[4776]: I1204 10:51:19.380559 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:51:19 crc kubenswrapper[4776]: I1204 10:51:19.381203 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:51:19 crc kubenswrapper[4776]: I1204 10:51:19.381263 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 10:51:19 crc kubenswrapper[4776]: I1204 10:51:19.382141 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:51:19 crc kubenswrapper[4776]: I1204 10:51:19.382207 4776 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" gracePeriod=600 Dec 04 10:51:19 crc kubenswrapper[4776]: E1204 10:51:19.539694 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:51:20 crc kubenswrapper[4776]: I1204 10:51:20.373136 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" exitCode=0 Dec 04 10:51:20 crc kubenswrapper[4776]: I1204 10:51:20.373186 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306"} Dec 04 10:51:20 crc kubenswrapper[4776]: I1204 10:51:20.373233 4776 scope.go:117] "RemoveContainer" containerID="de6ae4981b991fcd0800f65005c52451c9f035f8277040b04437c770c82be074" Dec 04 10:51:20 crc kubenswrapper[4776]: I1204 10:51:20.373818 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:51:20 crc kubenswrapper[4776]: E1204 10:51:20.374103 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:51:31 crc kubenswrapper[4776]: I1204 10:51:31.452442 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:51:31 crc kubenswrapper[4776]: E1204 10:51:31.453509 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:51:43 crc kubenswrapper[4776]: I1204 10:51:43.452441 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:51:43 crc kubenswrapper[4776]: E1204 10:51:43.453273 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:51:48 crc kubenswrapper[4776]: I1204 10:51:48.702646 4776 generic.go:334] "Generic (PLEG): container finished" podID="5e42f4d6-4793-4568-9a55-4d346b39dbac" containerID="8f847996b183a6b4849fb229ff052b1260206402612e6909f20dacf5262f7778" exitCode=0 Dec 04 10:51:48 crc kubenswrapper[4776]: I1204 10:51:48.702687 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"5e42f4d6-4793-4568-9a55-4d346b39dbac","Type":"ContainerDied","Data":"8f847996b183a6b4849fb229ff052b1260206402612e6909f20dacf5262f7778"} Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.051314 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.194313 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"5e42f4d6-4793-4568-9a55-4d346b39dbac\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.194391 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e42f4d6-4793-4568-9a55-4d346b39dbac-openstack-config\") pod \"5e42f4d6-4793-4568-9a55-4d346b39dbac\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.194427 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e42f4d6-4793-4568-9a55-4d346b39dbac-config-data\") pod \"5e42f4d6-4793-4568-9a55-4d346b39dbac\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.194529 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-openstack-config-secret\") pod \"5e42f4d6-4793-4568-9a55-4d346b39dbac\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.194566 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-ca-certs\") pod \"5e42f4d6-4793-4568-9a55-4d346b39dbac\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.194598 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e42f4d6-4793-4568-9a55-4d346b39dbac-test-operator-ephemeral-workdir\") pod \"5e42f4d6-4793-4568-9a55-4d346b39dbac\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.194646 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-ssh-key\") pod \"5e42f4d6-4793-4568-9a55-4d346b39dbac\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.194672 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgvbx\" (UniqueName: \"kubernetes.io/projected/5e42f4d6-4793-4568-9a55-4d346b39dbac-kube-api-access-jgvbx\") pod \"5e42f4d6-4793-4568-9a55-4d346b39dbac\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.194703 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e42f4d6-4793-4568-9a55-4d346b39dbac-test-operator-ephemeral-temporary\") pod \"5e42f4d6-4793-4568-9a55-4d346b39dbac\" (UID: \"5e42f4d6-4793-4568-9a55-4d346b39dbac\") " Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.195446 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e42f4d6-4793-4568-9a55-4d346b39dbac-config-data" (OuterVolumeSpecName: "config-data") pod "5e42f4d6-4793-4568-9a55-4d346b39dbac" (UID: "5e42f4d6-4793-4568-9a55-4d346b39dbac"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.195720 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e42f4d6-4793-4568-9a55-4d346b39dbac-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "5e42f4d6-4793-4568-9a55-4d346b39dbac" (UID: "5e42f4d6-4793-4568-9a55-4d346b39dbac"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.199373 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e42f4d6-4793-4568-9a55-4d346b39dbac-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "5e42f4d6-4793-4568-9a55-4d346b39dbac" (UID: "5e42f4d6-4793-4568-9a55-4d346b39dbac"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.206036 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e42f4d6-4793-4568-9a55-4d346b39dbac-kube-api-access-jgvbx" (OuterVolumeSpecName: "kube-api-access-jgvbx") pod "5e42f4d6-4793-4568-9a55-4d346b39dbac" (UID: "5e42f4d6-4793-4568-9a55-4d346b39dbac"). InnerVolumeSpecName "kube-api-access-jgvbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.219659 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "5e42f4d6-4793-4568-9a55-4d346b39dbac" (UID: "5e42f4d6-4793-4568-9a55-4d346b39dbac"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.227644 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "5e42f4d6-4793-4568-9a55-4d346b39dbac" (UID: "5e42f4d6-4793-4568-9a55-4d346b39dbac"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.232810 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5e42f4d6-4793-4568-9a55-4d346b39dbac" (UID: "5e42f4d6-4793-4568-9a55-4d346b39dbac"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.240613 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5e42f4d6-4793-4568-9a55-4d346b39dbac" (UID: "5e42f4d6-4793-4568-9a55-4d346b39dbac"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.265784 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e42f4d6-4793-4568-9a55-4d346b39dbac-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5e42f4d6-4793-4568-9a55-4d346b39dbac" (UID: "5e42f4d6-4793-4568-9a55-4d346b39dbac"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.296633 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.296667 4776 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.296680 4776 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e42f4d6-4793-4568-9a55-4d346b39dbac-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.296689 4776 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e42f4d6-4793-4568-9a55-4d346b39dbac-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.296700 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgvbx\" (UniqueName: \"kubernetes.io/projected/5e42f4d6-4793-4568-9a55-4d346b39dbac-kube-api-access-jgvbx\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.296711 4776 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e42f4d6-4793-4568-9a55-4d346b39dbac-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.296745 4776 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 04 
10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.296762 4776 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e42f4d6-4793-4568-9a55-4d346b39dbac-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.296776 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e42f4d6-4793-4568-9a55-4d346b39dbac-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.315892 4776 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.398635 4776 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.721560 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5e42f4d6-4793-4568-9a55-4d346b39dbac","Type":"ContainerDied","Data":"a5627ffccb47c4c0c083e3377a9056f9b22ac438a233a571be3d497578f4162f"} Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.721598 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5627ffccb47c4c0c083e3377a9056f9b22ac438a233a571be3d497578f4162f" Dec 04 10:51:50 crc kubenswrapper[4776]: I1204 10:51:50.722029 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 10:51:54 crc kubenswrapper[4776]: I1204 10:51:54.452776 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:51:54 crc kubenswrapper[4776]: E1204 10:51:54.453749 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.128184 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 04 10:52:02 crc kubenswrapper[4776]: E1204 10:52:02.129075 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b0a3a6-1dff-4990-82a1-d73a43286e8d" containerName="extract-content" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.129089 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b0a3a6-1dff-4990-82a1-d73a43286e8d" containerName="extract-content" Dec 04 10:52:02 crc kubenswrapper[4776]: E1204 10:52:02.129105 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93218ae5-7f9f-4e4e-b79d-3088fe0b3597" containerName="extract-content" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.129111 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="93218ae5-7f9f-4e4e-b79d-3088fe0b3597" containerName="extract-content" Dec 04 10:52:02 crc kubenswrapper[4776]: E1204 10:52:02.129120 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e42f4d6-4793-4568-9a55-4d346b39dbac" containerName="tempest-tests-tempest-tests-runner" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 
10:52:02.129127 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e42f4d6-4793-4568-9a55-4d346b39dbac" containerName="tempest-tests-tempest-tests-runner" Dec 04 10:52:02 crc kubenswrapper[4776]: E1204 10:52:02.129137 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b0a3a6-1dff-4990-82a1-d73a43286e8d" containerName="registry-server" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.129142 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b0a3a6-1dff-4990-82a1-d73a43286e8d" containerName="registry-server" Dec 04 10:52:02 crc kubenswrapper[4776]: E1204 10:52:02.129153 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff99458-ea93-454e-918a-141737b5b6c0" containerName="registry-server" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.129159 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff99458-ea93-454e-918a-141737b5b6c0" containerName="registry-server" Dec 04 10:52:02 crc kubenswrapper[4776]: E1204 10:52:02.129168 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93218ae5-7f9f-4e4e-b79d-3088fe0b3597" containerName="registry-server" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.129174 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="93218ae5-7f9f-4e4e-b79d-3088fe0b3597" containerName="registry-server" Dec 04 10:52:02 crc kubenswrapper[4776]: E1204 10:52:02.129190 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b0a3a6-1dff-4990-82a1-d73a43286e8d" containerName="extract-utilities" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.129196 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b0a3a6-1dff-4990-82a1-d73a43286e8d" containerName="extract-utilities" Dec 04 10:52:02 crc kubenswrapper[4776]: E1204 10:52:02.129214 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff99458-ea93-454e-918a-141737b5b6c0" containerName="extract-content" Dec 04 10:52:02 crc 
kubenswrapper[4776]: I1204 10:52:02.129221 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff99458-ea93-454e-918a-141737b5b6c0" containerName="extract-content" Dec 04 10:52:02 crc kubenswrapper[4776]: E1204 10:52:02.129246 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93218ae5-7f9f-4e4e-b79d-3088fe0b3597" containerName="extract-utilities" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.129255 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="93218ae5-7f9f-4e4e-b79d-3088fe0b3597" containerName="extract-utilities" Dec 04 10:52:02 crc kubenswrapper[4776]: E1204 10:52:02.129267 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff99458-ea93-454e-918a-141737b5b6c0" containerName="extract-utilities" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.129275 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff99458-ea93-454e-918a-141737b5b6c0" containerName="extract-utilities" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.129461 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="93218ae5-7f9f-4e4e-b79d-3088fe0b3597" containerName="registry-server" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.129478 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e42f4d6-4793-4568-9a55-4d346b39dbac" containerName="tempest-tests-tempest-tests-runner" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.129489 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ff99458-ea93-454e-918a-141737b5b6c0" containerName="registry-server" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.129507 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b0a3a6-1dff-4990-82a1-d73a43286e8d" containerName="registry-server" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.130166 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.140021 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-rjhhj" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.144909 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.271914 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0c9e28cc-5a44-4315-a018-c0678bc68347\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.272183 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lgdv\" (UniqueName: \"kubernetes.io/projected/0c9e28cc-5a44-4315-a018-c0678bc68347-kube-api-access-7lgdv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0c9e28cc-5a44-4315-a018-c0678bc68347\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.374377 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lgdv\" (UniqueName: \"kubernetes.io/projected/0c9e28cc-5a44-4315-a018-c0678bc68347-kube-api-access-7lgdv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0c9e28cc-5a44-4315-a018-c0678bc68347\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.374519 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0c9e28cc-5a44-4315-a018-c0678bc68347\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.375066 4776 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0c9e28cc-5a44-4315-a018-c0678bc68347\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.392968 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lgdv\" (UniqueName: \"kubernetes.io/projected/0c9e28cc-5a44-4315-a018-c0678bc68347-kube-api-access-7lgdv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0c9e28cc-5a44-4315-a018-c0678bc68347\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.398322 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0c9e28cc-5a44-4315-a018-c0678bc68347\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.465536 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.902947 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 04 10:52:02 crc kubenswrapper[4776]: I1204 10:52:02.906015 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:52:03 crc kubenswrapper[4776]: I1204 10:52:03.858784 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0c9e28cc-5a44-4315-a018-c0678bc68347","Type":"ContainerStarted","Data":"3cb403d47c45daea74ca41f39c9477c54a83c38f31166eda161a5d5c0c0d612c"} Dec 04 10:52:03 crc kubenswrapper[4776]: I1204 10:52:03.859241 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0c9e28cc-5a44-4315-a018-c0678bc68347","Type":"ContainerStarted","Data":"5dfe5e3ec7cd249c43362e72f783e142453bd6496fde66147f25e59fca520f10"} Dec 04 10:52:03 crc kubenswrapper[4776]: I1204 10:52:03.875091 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.150926236 podStartE2EDuration="1.875060305s" podCreationTimestamp="2025-12-04 10:52:02 +0000 UTC" firstStartedPulling="2025-12-04 10:52:02.905663725 +0000 UTC m=+4367.772144112" lastFinishedPulling="2025-12-04 10:52:03.629797804 +0000 UTC m=+4368.496278181" observedRunningTime="2025-12-04 10:52:03.872071191 +0000 UTC m=+4368.738551558" watchObservedRunningTime="2025-12-04 10:52:03.875060305 +0000 UTC m=+4368.741540692" Dec 04 10:52:07 crc kubenswrapper[4776]: I1204 10:52:07.453675 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:52:07 crc kubenswrapper[4776]: E1204 
10:52:07.454602 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:52:20 crc kubenswrapper[4776]: I1204 10:52:20.452899 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:52:20 crc kubenswrapper[4776]: E1204 10:52:20.454852 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:52:29 crc kubenswrapper[4776]: I1204 10:52:29.531193 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-99q5c/must-gather-zf8bk"] Dec 04 10:52:29 crc kubenswrapper[4776]: I1204 10:52:29.534590 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99q5c/must-gather-zf8bk" Dec 04 10:52:29 crc kubenswrapper[4776]: I1204 10:52:29.541634 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-99q5c"/"openshift-service-ca.crt" Dec 04 10:52:29 crc kubenswrapper[4776]: I1204 10:52:29.541752 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-99q5c"/"default-dockercfg-rlqwr" Dec 04 10:52:29 crc kubenswrapper[4776]: I1204 10:52:29.541880 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-99q5c/must-gather-zf8bk"] Dec 04 10:52:29 crc kubenswrapper[4776]: I1204 10:52:29.542388 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-99q5c"/"kube-root-ca.crt" Dec 04 10:52:29 crc kubenswrapper[4776]: I1204 10:52:29.669050 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6f5f7d0-9c13-4c1c-b13b-718f74ae1320-must-gather-output\") pod \"must-gather-zf8bk\" (UID: \"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320\") " pod="openshift-must-gather-99q5c/must-gather-zf8bk" Dec 04 10:52:29 crc kubenswrapper[4776]: I1204 10:52:29.669442 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pdvk\" (UniqueName: \"kubernetes.io/projected/d6f5f7d0-9c13-4c1c-b13b-718f74ae1320-kube-api-access-2pdvk\") pod \"must-gather-zf8bk\" (UID: \"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320\") " pod="openshift-must-gather-99q5c/must-gather-zf8bk" Dec 04 10:52:29 crc kubenswrapper[4776]: I1204 10:52:29.771175 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pdvk\" (UniqueName: \"kubernetes.io/projected/d6f5f7d0-9c13-4c1c-b13b-718f74ae1320-kube-api-access-2pdvk\") pod \"must-gather-zf8bk\" (UID: \"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320\") " 
pod="openshift-must-gather-99q5c/must-gather-zf8bk" Dec 04 10:52:29 crc kubenswrapper[4776]: I1204 10:52:29.771351 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6f5f7d0-9c13-4c1c-b13b-718f74ae1320-must-gather-output\") pod \"must-gather-zf8bk\" (UID: \"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320\") " pod="openshift-must-gather-99q5c/must-gather-zf8bk" Dec 04 10:52:29 crc kubenswrapper[4776]: I1204 10:52:29.771873 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6f5f7d0-9c13-4c1c-b13b-718f74ae1320-must-gather-output\") pod \"must-gather-zf8bk\" (UID: \"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320\") " pod="openshift-must-gather-99q5c/must-gather-zf8bk" Dec 04 10:52:29 crc kubenswrapper[4776]: I1204 10:52:29.793653 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pdvk\" (UniqueName: \"kubernetes.io/projected/d6f5f7d0-9c13-4c1c-b13b-718f74ae1320-kube-api-access-2pdvk\") pod \"must-gather-zf8bk\" (UID: \"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320\") " pod="openshift-must-gather-99q5c/must-gather-zf8bk" Dec 04 10:52:29 crc kubenswrapper[4776]: I1204 10:52:29.857784 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99q5c/must-gather-zf8bk" Dec 04 10:52:30 crc kubenswrapper[4776]: I1204 10:52:30.371534 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-99q5c/must-gather-zf8bk"] Dec 04 10:52:31 crc kubenswrapper[4776]: I1204 10:52:31.113111 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99q5c/must-gather-zf8bk" event={"ID":"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320","Type":"ContainerStarted","Data":"ab0c3e2a8745ff61dfd1c63a3db9e8d3252fa58603cff5f69c7b8ae580e45ebb"} Dec 04 10:52:35 crc kubenswrapper[4776]: I1204 10:52:35.155568 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99q5c/must-gather-zf8bk" event={"ID":"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320","Type":"ContainerStarted","Data":"4373f7be7df1dd0b1c1e3d6c90c675f720c801ac7c4e240265517fe16db63164"} Dec 04 10:52:35 crc kubenswrapper[4776]: I1204 10:52:35.156702 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99q5c/must-gather-zf8bk" event={"ID":"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320","Type":"ContainerStarted","Data":"f6efb918dc55608406785850c34a8b7304e0a7a9791ab8fbd7d4e24ea7742e31"} Dec 04 10:52:35 crc kubenswrapper[4776]: I1204 10:52:35.170856 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-99q5c/must-gather-zf8bk" podStartSLOduration=2.578671316 podStartE2EDuration="6.170834926s" podCreationTimestamp="2025-12-04 10:52:29 +0000 UTC" firstStartedPulling="2025-12-04 10:52:30.3750193 +0000 UTC m=+4395.241499677" lastFinishedPulling="2025-12-04 10:52:33.96718291 +0000 UTC m=+4398.833663287" observedRunningTime="2025-12-04 10:52:35.169352489 +0000 UTC m=+4400.035832876" watchObservedRunningTime="2025-12-04 10:52:35.170834926 +0000 UTC m=+4400.037315303" Dec 04 10:52:35 crc kubenswrapper[4776]: I1204 10:52:35.458742 4776 scope.go:117] "RemoveContainer" 
containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:52:35 crc kubenswrapper[4776]: E1204 10:52:35.459029 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:52:38 crc kubenswrapper[4776]: I1204 10:52:38.238387 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-99q5c/crc-debug-27f7c"] Dec 04 10:52:38 crc kubenswrapper[4776]: I1204 10:52:38.240626 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99q5c/crc-debug-27f7c" Dec 04 10:52:38 crc kubenswrapper[4776]: I1204 10:52:38.373634 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thtsg\" (UniqueName: \"kubernetes.io/projected/1972c77f-6b0b-4829-83d6-90f6d70991dc-kube-api-access-thtsg\") pod \"crc-debug-27f7c\" (UID: \"1972c77f-6b0b-4829-83d6-90f6d70991dc\") " pod="openshift-must-gather-99q5c/crc-debug-27f7c" Dec 04 10:52:38 crc kubenswrapper[4776]: I1204 10:52:38.373864 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1972c77f-6b0b-4829-83d6-90f6d70991dc-host\") pod \"crc-debug-27f7c\" (UID: \"1972c77f-6b0b-4829-83d6-90f6d70991dc\") " pod="openshift-must-gather-99q5c/crc-debug-27f7c" Dec 04 10:52:38 crc kubenswrapper[4776]: I1204 10:52:38.475576 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thtsg\" (UniqueName: \"kubernetes.io/projected/1972c77f-6b0b-4829-83d6-90f6d70991dc-kube-api-access-thtsg\") pod 
\"crc-debug-27f7c\" (UID: \"1972c77f-6b0b-4829-83d6-90f6d70991dc\") " pod="openshift-must-gather-99q5c/crc-debug-27f7c" Dec 04 10:52:38 crc kubenswrapper[4776]: I1204 10:52:38.475712 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1972c77f-6b0b-4829-83d6-90f6d70991dc-host\") pod \"crc-debug-27f7c\" (UID: \"1972c77f-6b0b-4829-83d6-90f6d70991dc\") " pod="openshift-must-gather-99q5c/crc-debug-27f7c" Dec 04 10:52:38 crc kubenswrapper[4776]: I1204 10:52:38.475902 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1972c77f-6b0b-4829-83d6-90f6d70991dc-host\") pod \"crc-debug-27f7c\" (UID: \"1972c77f-6b0b-4829-83d6-90f6d70991dc\") " pod="openshift-must-gather-99q5c/crc-debug-27f7c" Dec 04 10:52:38 crc kubenswrapper[4776]: I1204 10:52:38.499868 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thtsg\" (UniqueName: \"kubernetes.io/projected/1972c77f-6b0b-4829-83d6-90f6d70991dc-kube-api-access-thtsg\") pod \"crc-debug-27f7c\" (UID: \"1972c77f-6b0b-4829-83d6-90f6d70991dc\") " pod="openshift-must-gather-99q5c/crc-debug-27f7c" Dec 04 10:52:38 crc kubenswrapper[4776]: I1204 10:52:38.561665 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99q5c/crc-debug-27f7c" Dec 04 10:52:39 crc kubenswrapper[4776]: I1204 10:52:39.201565 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99q5c/crc-debug-27f7c" event={"ID":"1972c77f-6b0b-4829-83d6-90f6d70991dc","Type":"ContainerStarted","Data":"79dc5db942f2fb4d2125bdf2be9a048734e01bbd826ed1bb76da5c53dbd76912"} Dec 04 10:52:48 crc kubenswrapper[4776]: I1204 10:52:48.452295 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:52:48 crc kubenswrapper[4776]: E1204 10:52:48.453236 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:52:50 crc kubenswrapper[4776]: I1204 10:52:50.306513 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99q5c/crc-debug-27f7c" event={"ID":"1972c77f-6b0b-4829-83d6-90f6d70991dc","Type":"ContainerStarted","Data":"d2fcd6c26c37fcef090120e5e9bb964361193a11c773b9e7c6cb2b428cd67a19"} Dec 04 10:53:02 crc kubenswrapper[4776]: I1204 10:53:02.452031 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:53:02 crc kubenswrapper[4776]: E1204 10:53:02.452767 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:53:17 crc kubenswrapper[4776]: I1204 10:53:17.452192 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:53:17 crc kubenswrapper[4776]: E1204 10:53:17.454392 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:53:28 crc kubenswrapper[4776]: I1204 10:53:28.452090 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:53:28 crc kubenswrapper[4776]: E1204 10:53:28.452721 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:53:43 crc kubenswrapper[4776]: I1204 10:53:43.453100 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:53:43 crc kubenswrapper[4776]: E1204 10:53:43.453988 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:53:57 crc kubenswrapper[4776]: I1204 10:53:57.453414 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:53:57 crc kubenswrapper[4776]: E1204 10:53:57.453995 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:53:59 crc kubenswrapper[4776]: I1204 10:53:59.952484 4776 generic.go:334] "Generic (PLEG): container finished" podID="1972c77f-6b0b-4829-83d6-90f6d70991dc" containerID="d2fcd6c26c37fcef090120e5e9bb964361193a11c773b9e7c6cb2b428cd67a19" exitCode=0 Dec 04 10:53:59 crc kubenswrapper[4776]: I1204 10:53:59.952585 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99q5c/crc-debug-27f7c" event={"ID":"1972c77f-6b0b-4829-83d6-90f6d70991dc","Type":"ContainerDied","Data":"d2fcd6c26c37fcef090120e5e9bb964361193a11c773b9e7c6cb2b428cd67a19"} Dec 04 10:54:01 crc kubenswrapper[4776]: I1204 10:54:01.095931 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99q5c/crc-debug-27f7c" Dec 04 10:54:01 crc kubenswrapper[4776]: I1204 10:54:01.140182 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-99q5c/crc-debug-27f7c"] Dec 04 10:54:01 crc kubenswrapper[4776]: I1204 10:54:01.147659 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-99q5c/crc-debug-27f7c"] Dec 04 10:54:01 crc kubenswrapper[4776]: I1204 10:54:01.161063 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1972c77f-6b0b-4829-83d6-90f6d70991dc-host\") pod \"1972c77f-6b0b-4829-83d6-90f6d70991dc\" (UID: \"1972c77f-6b0b-4829-83d6-90f6d70991dc\") " Dec 04 10:54:01 crc kubenswrapper[4776]: I1204 10:54:01.161297 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thtsg\" (UniqueName: \"kubernetes.io/projected/1972c77f-6b0b-4829-83d6-90f6d70991dc-kube-api-access-thtsg\") pod \"1972c77f-6b0b-4829-83d6-90f6d70991dc\" (UID: \"1972c77f-6b0b-4829-83d6-90f6d70991dc\") " Dec 04 10:54:01 crc kubenswrapper[4776]: I1204 10:54:01.161471 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1972c77f-6b0b-4829-83d6-90f6d70991dc-host" (OuterVolumeSpecName: "host") pod "1972c77f-6b0b-4829-83d6-90f6d70991dc" (UID: "1972c77f-6b0b-4829-83d6-90f6d70991dc"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:54:01 crc kubenswrapper[4776]: I1204 10:54:01.162119 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1972c77f-6b0b-4829-83d6-90f6d70991dc-host\") on node \"crc\" DevicePath \"\"" Dec 04 10:54:01 crc kubenswrapper[4776]: I1204 10:54:01.166491 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1972c77f-6b0b-4829-83d6-90f6d70991dc-kube-api-access-thtsg" (OuterVolumeSpecName: "kube-api-access-thtsg") pod "1972c77f-6b0b-4829-83d6-90f6d70991dc" (UID: "1972c77f-6b0b-4829-83d6-90f6d70991dc"). InnerVolumeSpecName "kube-api-access-thtsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:54:01 crc kubenswrapper[4776]: I1204 10:54:01.264215 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thtsg\" (UniqueName: \"kubernetes.io/projected/1972c77f-6b0b-4829-83d6-90f6d70991dc-kube-api-access-thtsg\") on node \"crc\" DevicePath \"\"" Dec 04 10:54:01 crc kubenswrapper[4776]: I1204 10:54:01.461854 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1972c77f-6b0b-4829-83d6-90f6d70991dc" path="/var/lib/kubelet/pods/1972c77f-6b0b-4829-83d6-90f6d70991dc/volumes" Dec 04 10:54:01 crc kubenswrapper[4776]: I1204 10:54:01.974262 4776 scope.go:117] "RemoveContainer" containerID="d2fcd6c26c37fcef090120e5e9bb964361193a11c773b9e7c6cb2b428cd67a19" Dec 04 10:54:01 crc kubenswrapper[4776]: I1204 10:54:01.974318 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99q5c/crc-debug-27f7c" Dec 04 10:54:02 crc kubenswrapper[4776]: I1204 10:54:02.294687 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-99q5c/crc-debug-bqj6v"] Dec 04 10:54:02 crc kubenswrapper[4776]: E1204 10:54:02.295059 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1972c77f-6b0b-4829-83d6-90f6d70991dc" containerName="container-00" Dec 04 10:54:02 crc kubenswrapper[4776]: I1204 10:54:02.295072 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1972c77f-6b0b-4829-83d6-90f6d70991dc" containerName="container-00" Dec 04 10:54:02 crc kubenswrapper[4776]: I1204 10:54:02.295284 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="1972c77f-6b0b-4829-83d6-90f6d70991dc" containerName="container-00" Dec 04 10:54:02 crc kubenswrapper[4776]: I1204 10:54:02.295890 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99q5c/crc-debug-bqj6v" Dec 04 10:54:02 crc kubenswrapper[4776]: I1204 10:54:02.385069 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3827479-8df0-453e-95b6-a152efdd0227-host\") pod \"crc-debug-bqj6v\" (UID: \"e3827479-8df0-453e-95b6-a152efdd0227\") " pod="openshift-must-gather-99q5c/crc-debug-bqj6v" Dec 04 10:54:02 crc kubenswrapper[4776]: I1204 10:54:02.385391 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g96br\" (UniqueName: \"kubernetes.io/projected/e3827479-8df0-453e-95b6-a152efdd0227-kube-api-access-g96br\") pod \"crc-debug-bqj6v\" (UID: \"e3827479-8df0-453e-95b6-a152efdd0227\") " pod="openshift-must-gather-99q5c/crc-debug-bqj6v" Dec 04 10:54:02 crc kubenswrapper[4776]: I1204 10:54:02.487726 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/e3827479-8df0-453e-95b6-a152efdd0227-host\") pod \"crc-debug-bqj6v\" (UID: \"e3827479-8df0-453e-95b6-a152efdd0227\") " pod="openshift-must-gather-99q5c/crc-debug-bqj6v" Dec 04 10:54:02 crc kubenswrapper[4776]: I1204 10:54:02.487962 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3827479-8df0-453e-95b6-a152efdd0227-host\") pod \"crc-debug-bqj6v\" (UID: \"e3827479-8df0-453e-95b6-a152efdd0227\") " pod="openshift-must-gather-99q5c/crc-debug-bqj6v" Dec 04 10:54:02 crc kubenswrapper[4776]: I1204 10:54:02.488107 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g96br\" (UniqueName: \"kubernetes.io/projected/e3827479-8df0-453e-95b6-a152efdd0227-kube-api-access-g96br\") pod \"crc-debug-bqj6v\" (UID: \"e3827479-8df0-453e-95b6-a152efdd0227\") " pod="openshift-must-gather-99q5c/crc-debug-bqj6v" Dec 04 10:54:02 crc kubenswrapper[4776]: I1204 10:54:02.514094 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g96br\" (UniqueName: \"kubernetes.io/projected/e3827479-8df0-453e-95b6-a152efdd0227-kube-api-access-g96br\") pod \"crc-debug-bqj6v\" (UID: \"e3827479-8df0-453e-95b6-a152efdd0227\") " pod="openshift-must-gather-99q5c/crc-debug-bqj6v" Dec 04 10:54:02 crc kubenswrapper[4776]: I1204 10:54:02.615145 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99q5c/crc-debug-bqj6v" Dec 04 10:54:02 crc kubenswrapper[4776]: I1204 10:54:02.983322 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99q5c/crc-debug-bqj6v" event={"ID":"e3827479-8df0-453e-95b6-a152efdd0227","Type":"ContainerStarted","Data":"67f9d07710d4a0b4c5f010f5fdf8d959cdcd21ed25cc20b1ab0e048e0d087d88"} Dec 04 10:54:02 crc kubenswrapper[4776]: I1204 10:54:02.983366 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99q5c/crc-debug-bqj6v" event={"ID":"e3827479-8df0-453e-95b6-a152efdd0227","Type":"ContainerStarted","Data":"c766eb4b88b91e442ab7fff1a8d02fa0497ae78ca78f82c28c04689dc9e96eda"} Dec 04 10:54:03 crc kubenswrapper[4776]: I1204 10:54:03.001514 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-99q5c/crc-debug-bqj6v" podStartSLOduration=1.001493635 podStartE2EDuration="1.001493635s" podCreationTimestamp="2025-12-04 10:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 10:54:02.99529763 +0000 UTC m=+4487.861778007" watchObservedRunningTime="2025-12-04 10:54:03.001493635 +0000 UTC m=+4487.867974032" Dec 04 10:54:04 crc kubenswrapper[4776]: I1204 10:54:04.005820 4776 generic.go:334] "Generic (PLEG): container finished" podID="e3827479-8df0-453e-95b6-a152efdd0227" containerID="67f9d07710d4a0b4c5f010f5fdf8d959cdcd21ed25cc20b1ab0e048e0d087d88" exitCode=0 Dec 04 10:54:04 crc kubenswrapper[4776]: I1204 10:54:04.006220 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99q5c/crc-debug-bqj6v" event={"ID":"e3827479-8df0-453e-95b6-a152efdd0227","Type":"ContainerDied","Data":"67f9d07710d4a0b4c5f010f5fdf8d959cdcd21ed25cc20b1ab0e048e0d087d88"} Dec 04 10:54:05 crc kubenswrapper[4776]: I1204 10:54:05.110590 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99q5c/crc-debug-bqj6v" Dec 04 10:54:05 crc kubenswrapper[4776]: I1204 10:54:05.148471 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-99q5c/crc-debug-bqj6v"] Dec 04 10:54:05 crc kubenswrapper[4776]: I1204 10:54:05.157223 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-99q5c/crc-debug-bqj6v"] Dec 04 10:54:05 crc kubenswrapper[4776]: I1204 10:54:05.249333 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g96br\" (UniqueName: \"kubernetes.io/projected/e3827479-8df0-453e-95b6-a152efdd0227-kube-api-access-g96br\") pod \"e3827479-8df0-453e-95b6-a152efdd0227\" (UID: \"e3827479-8df0-453e-95b6-a152efdd0227\") " Dec 04 10:54:05 crc kubenswrapper[4776]: I1204 10:54:05.249872 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3827479-8df0-453e-95b6-a152efdd0227-host\") pod \"e3827479-8df0-453e-95b6-a152efdd0227\" (UID: \"e3827479-8df0-453e-95b6-a152efdd0227\") " Dec 04 10:54:05 crc kubenswrapper[4776]: I1204 10:54:05.249943 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3827479-8df0-453e-95b6-a152efdd0227-host" (OuterVolumeSpecName: "host") pod "e3827479-8df0-453e-95b6-a152efdd0227" (UID: "e3827479-8df0-453e-95b6-a152efdd0227"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:54:05 crc kubenswrapper[4776]: I1204 10:54:05.250754 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3827479-8df0-453e-95b6-a152efdd0227-host\") on node \"crc\" DevicePath \"\"" Dec 04 10:54:05 crc kubenswrapper[4776]: I1204 10:54:05.258120 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3827479-8df0-453e-95b6-a152efdd0227-kube-api-access-g96br" (OuterVolumeSpecName: "kube-api-access-g96br") pod "e3827479-8df0-453e-95b6-a152efdd0227" (UID: "e3827479-8df0-453e-95b6-a152efdd0227"). InnerVolumeSpecName "kube-api-access-g96br". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:54:05 crc kubenswrapper[4776]: I1204 10:54:05.352250 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g96br\" (UniqueName: \"kubernetes.io/projected/e3827479-8df0-453e-95b6-a152efdd0227-kube-api-access-g96br\") on node \"crc\" DevicePath \"\"" Dec 04 10:54:05 crc kubenswrapper[4776]: I1204 10:54:05.461986 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3827479-8df0-453e-95b6-a152efdd0227" path="/var/lib/kubelet/pods/e3827479-8df0-453e-95b6-a152efdd0227/volumes" Dec 04 10:54:06 crc kubenswrapper[4776]: I1204 10:54:06.021594 4776 scope.go:117] "RemoveContainer" containerID="67f9d07710d4a0b4c5f010f5fdf8d959cdcd21ed25cc20b1ab0e048e0d087d88" Dec 04 10:54:06 crc kubenswrapper[4776]: I1204 10:54:06.021662 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99q5c/crc-debug-bqj6v" Dec 04 10:54:06 crc kubenswrapper[4776]: I1204 10:54:06.306609 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-99q5c/crc-debug-gtl9n"] Dec 04 10:54:06 crc kubenswrapper[4776]: E1204 10:54:06.307410 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3827479-8df0-453e-95b6-a152efdd0227" containerName="container-00" Dec 04 10:54:06 crc kubenswrapper[4776]: I1204 10:54:06.307428 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3827479-8df0-453e-95b6-a152efdd0227" containerName="container-00" Dec 04 10:54:06 crc kubenswrapper[4776]: I1204 10:54:06.307653 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3827479-8df0-453e-95b6-a152efdd0227" containerName="container-00" Dec 04 10:54:06 crc kubenswrapper[4776]: I1204 10:54:06.308509 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99q5c/crc-debug-gtl9n" Dec 04 10:54:06 crc kubenswrapper[4776]: I1204 10:54:06.374011 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcmkn\" (UniqueName: \"kubernetes.io/projected/84bfe5f4-2080-4eb4-8a2e-b38600e7af04-kube-api-access-wcmkn\") pod \"crc-debug-gtl9n\" (UID: \"84bfe5f4-2080-4eb4-8a2e-b38600e7af04\") " pod="openshift-must-gather-99q5c/crc-debug-gtl9n" Dec 04 10:54:06 crc kubenswrapper[4776]: I1204 10:54:06.374140 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84bfe5f4-2080-4eb4-8a2e-b38600e7af04-host\") pod \"crc-debug-gtl9n\" (UID: \"84bfe5f4-2080-4eb4-8a2e-b38600e7af04\") " pod="openshift-must-gather-99q5c/crc-debug-gtl9n" Dec 04 10:54:06 crc kubenswrapper[4776]: I1204 10:54:06.475579 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcmkn\" (UniqueName: 
\"kubernetes.io/projected/84bfe5f4-2080-4eb4-8a2e-b38600e7af04-kube-api-access-wcmkn\") pod \"crc-debug-gtl9n\" (UID: \"84bfe5f4-2080-4eb4-8a2e-b38600e7af04\") " pod="openshift-must-gather-99q5c/crc-debug-gtl9n" Dec 04 10:54:06 crc kubenswrapper[4776]: I1204 10:54:06.476212 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84bfe5f4-2080-4eb4-8a2e-b38600e7af04-host\") pod \"crc-debug-gtl9n\" (UID: \"84bfe5f4-2080-4eb4-8a2e-b38600e7af04\") " pod="openshift-must-gather-99q5c/crc-debug-gtl9n" Dec 04 10:54:06 crc kubenswrapper[4776]: I1204 10:54:06.476293 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84bfe5f4-2080-4eb4-8a2e-b38600e7af04-host\") pod \"crc-debug-gtl9n\" (UID: \"84bfe5f4-2080-4eb4-8a2e-b38600e7af04\") " pod="openshift-must-gather-99q5c/crc-debug-gtl9n" Dec 04 10:54:06 crc kubenswrapper[4776]: I1204 10:54:06.508143 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcmkn\" (UniqueName: \"kubernetes.io/projected/84bfe5f4-2080-4eb4-8a2e-b38600e7af04-kube-api-access-wcmkn\") pod \"crc-debug-gtl9n\" (UID: \"84bfe5f4-2080-4eb4-8a2e-b38600e7af04\") " pod="openshift-must-gather-99q5c/crc-debug-gtl9n" Dec 04 10:54:06 crc kubenswrapper[4776]: I1204 10:54:06.635773 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99q5c/crc-debug-gtl9n" Dec 04 10:54:06 crc kubenswrapper[4776]: W1204 10:54:06.683451 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84bfe5f4_2080_4eb4_8a2e_b38600e7af04.slice/crio-1d88261c25081330ac8bd172fbe8620c3d02442b52dc5c790d1f89883c2203ec WatchSource:0}: Error finding container 1d88261c25081330ac8bd172fbe8620c3d02442b52dc5c790d1f89883c2203ec: Status 404 returned error can't find the container with id 1d88261c25081330ac8bd172fbe8620c3d02442b52dc5c790d1f89883c2203ec Dec 04 10:54:07 crc kubenswrapper[4776]: I1204 10:54:07.032453 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99q5c/crc-debug-gtl9n" event={"ID":"84bfe5f4-2080-4eb4-8a2e-b38600e7af04","Type":"ContainerStarted","Data":"1d88261c25081330ac8bd172fbe8620c3d02442b52dc5c790d1f89883c2203ec"} Dec 04 10:54:08 crc kubenswrapper[4776]: I1204 10:54:08.063690 4776 generic.go:334] "Generic (PLEG): container finished" podID="84bfe5f4-2080-4eb4-8a2e-b38600e7af04" containerID="f0569ce9b759f50633b04e0ac9e2552a019fe5d729a95e007c70f63d5b2351e7" exitCode=0 Dec 04 10:54:08 crc kubenswrapper[4776]: I1204 10:54:08.065320 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99q5c/crc-debug-gtl9n" event={"ID":"84bfe5f4-2080-4eb4-8a2e-b38600e7af04","Type":"ContainerDied","Data":"f0569ce9b759f50633b04e0ac9e2552a019fe5d729a95e007c70f63d5b2351e7"} Dec 04 10:54:08 crc kubenswrapper[4776]: I1204 10:54:08.114254 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-99q5c/crc-debug-gtl9n"] Dec 04 10:54:08 crc kubenswrapper[4776]: I1204 10:54:08.122291 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-99q5c/crc-debug-gtl9n"] Dec 04 10:54:09 crc kubenswrapper[4776]: I1204 10:54:09.170165 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99q5c/crc-debug-gtl9n" Dec 04 10:54:09 crc kubenswrapper[4776]: I1204 10:54:09.238419 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcmkn\" (UniqueName: \"kubernetes.io/projected/84bfe5f4-2080-4eb4-8a2e-b38600e7af04-kube-api-access-wcmkn\") pod \"84bfe5f4-2080-4eb4-8a2e-b38600e7af04\" (UID: \"84bfe5f4-2080-4eb4-8a2e-b38600e7af04\") " Dec 04 10:54:09 crc kubenswrapper[4776]: I1204 10:54:09.238677 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84bfe5f4-2080-4eb4-8a2e-b38600e7af04-host\") pod \"84bfe5f4-2080-4eb4-8a2e-b38600e7af04\" (UID: \"84bfe5f4-2080-4eb4-8a2e-b38600e7af04\") " Dec 04 10:54:09 crc kubenswrapper[4776]: I1204 10:54:09.238784 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84bfe5f4-2080-4eb4-8a2e-b38600e7af04-host" (OuterVolumeSpecName: "host") pod "84bfe5f4-2080-4eb4-8a2e-b38600e7af04" (UID: "84bfe5f4-2080-4eb4-8a2e-b38600e7af04"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 10:54:09 crc kubenswrapper[4776]: I1204 10:54:09.239665 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84bfe5f4-2080-4eb4-8a2e-b38600e7af04-host\") on node \"crc\" DevicePath \"\"" Dec 04 10:54:09 crc kubenswrapper[4776]: I1204 10:54:09.253377 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84bfe5f4-2080-4eb4-8a2e-b38600e7af04-kube-api-access-wcmkn" (OuterVolumeSpecName: "kube-api-access-wcmkn") pod "84bfe5f4-2080-4eb4-8a2e-b38600e7af04" (UID: "84bfe5f4-2080-4eb4-8a2e-b38600e7af04"). InnerVolumeSpecName "kube-api-access-wcmkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:54:09 crc kubenswrapper[4776]: I1204 10:54:09.340986 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcmkn\" (UniqueName: \"kubernetes.io/projected/84bfe5f4-2080-4eb4-8a2e-b38600e7af04-kube-api-access-wcmkn\") on node \"crc\" DevicePath \"\"" Dec 04 10:54:09 crc kubenswrapper[4776]: I1204 10:54:09.452905 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:54:09 crc kubenswrapper[4776]: E1204 10:54:09.453185 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:54:09 crc kubenswrapper[4776]: I1204 10:54:09.468526 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84bfe5f4-2080-4eb4-8a2e-b38600e7af04" path="/var/lib/kubelet/pods/84bfe5f4-2080-4eb4-8a2e-b38600e7af04/volumes" Dec 04 10:54:10 crc kubenswrapper[4776]: I1204 10:54:10.081529 4776 scope.go:117] "RemoveContainer" containerID="f0569ce9b759f50633b04e0ac9e2552a019fe5d729a95e007c70f63d5b2351e7" Dec 04 10:54:10 crc kubenswrapper[4776]: I1204 10:54:10.081569 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99q5c/crc-debug-gtl9n" Dec 04 10:54:21 crc kubenswrapper[4776]: I1204 10:54:21.453931 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:54:21 crc kubenswrapper[4776]: E1204 10:54:21.454558 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:54:24 crc kubenswrapper[4776]: I1204 10:54:24.072402 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-555f995688-x45jv_71750c70-c6f5-441b-8dae-2c78f53f5e0f/barbican-api/0.log" Dec 04 10:54:24 crc kubenswrapper[4776]: I1204 10:54:24.198491 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-555f995688-x45jv_71750c70-c6f5-441b-8dae-2c78f53f5e0f/barbican-api-log/0.log" Dec 04 10:54:24 crc kubenswrapper[4776]: I1204 10:54:24.286025 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bddcbff8b-x8j6m_e476a541-1b98-470c-adf7-812cc06763e1/barbican-keystone-listener/0.log" Dec 04 10:54:24 crc kubenswrapper[4776]: I1204 10:54:24.617728 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bddcbff8b-x8j6m_e476a541-1b98-470c-adf7-812cc06763e1/barbican-keystone-listener-log/0.log" Dec 04 10:54:24 crc kubenswrapper[4776]: I1204 10:54:24.738965 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-854556677-xxtrd_9d48466a-6e63-429a-aba8-cc93741041f4/barbican-worker-log/0.log" Dec 04 10:54:24 crc kubenswrapper[4776]: I1204 
10:54:24.800038 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-854556677-xxtrd_9d48466a-6e63-429a-aba8-cc93741041f4/barbican-worker/0.log" Dec 04 10:54:24 crc kubenswrapper[4776]: I1204 10:54:24.974809 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78_472ce27b-24c6-4557-9775-971817286847/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:25 crc kubenswrapper[4776]: I1204 10:54:25.046840 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_af73cf01-ace5-4bc7-a209-2f9eb86cb7d6/ceilometer-central-agent/0.log" Dec 04 10:54:25 crc kubenswrapper[4776]: I1204 10:54:25.190341 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_af73cf01-ace5-4bc7-a209-2f9eb86cb7d6/ceilometer-notification-agent/0.log" Dec 04 10:54:25 crc kubenswrapper[4776]: I1204 10:54:25.209286 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_af73cf01-ace5-4bc7-a209-2f9eb86cb7d6/proxy-httpd/0.log" Dec 04 10:54:25 crc kubenswrapper[4776]: I1204 10:54:25.245398 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_af73cf01-ace5-4bc7-a209-2f9eb86cb7d6/sg-core/0.log" Dec 04 10:54:25 crc kubenswrapper[4776]: I1204 10:54:25.371047 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm_dd3ec814-a647-4575-abd7-fbdec22fd54f/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:25 crc kubenswrapper[4776]: I1204 10:54:25.511126 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6_866bd984-5d2f-4eb5-ad8e-e05f3e2d1660/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:25 crc kubenswrapper[4776]: I1204 10:54:25.640180 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_4b7e0c9e-6f33-42f0-af0a-0ec740ba7206/cinder-api/0.log" Dec 04 10:54:25 crc kubenswrapper[4776]: I1204 10:54:25.688657 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4b7e0c9e-6f33-42f0-af0a-0ec740ba7206/cinder-api-log/0.log" Dec 04 10:54:25 crc kubenswrapper[4776]: I1204 10:54:25.894131 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_55871647-5a7a-4fbf-954e-67418476628e/probe/0.log" Dec 04 10:54:25 crc kubenswrapper[4776]: I1204 10:54:25.901617 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_55871647-5a7a-4fbf-954e-67418476628e/cinder-backup/0.log" Dec 04 10:54:25 crc kubenswrapper[4776]: I1204 10:54:25.962804 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_732251b5-2be6-4542-89b4-e20649ec27d0/cinder-scheduler/0.log" Dec 04 10:54:26 crc kubenswrapper[4776]: I1204 10:54:26.690958 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_8933a5ae-42ff-44b3-bd28-38a424729b83/probe/0.log" Dec 04 10:54:26 crc kubenswrapper[4776]: I1204 10:54:26.709684 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_732251b5-2be6-4542-89b4-e20649ec27d0/probe/0.log" Dec 04 10:54:26 crc kubenswrapper[4776]: I1204 10:54:26.772538 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_8933a5ae-42ff-44b3-bd28-38a424729b83/cinder-volume/0.log" Dec 04 10:54:26 crc kubenswrapper[4776]: I1204 10:54:26.932222 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk_59d725b6-d177-4b01-a89e-8fca3d2127ae/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:27 crc kubenswrapper[4776]: I1204 10:54:27.067006 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q_fac53d89-7903-4d3b-abea-efbbe8f6a1b3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:27 crc kubenswrapper[4776]: I1204 10:54:27.179601 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-slhwr_e6596bf3-fdc9-4ccf-b81a-3e5372bef33f/init/0.log" Dec 04 10:54:27 crc kubenswrapper[4776]: I1204 10:54:27.346774 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-slhwr_e6596bf3-fdc9-4ccf-b81a-3e5372bef33f/init/0.log" Dec 04 10:54:27 crc kubenswrapper[4776]: I1204 10:54:27.378055 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-slhwr_e6596bf3-fdc9-4ccf-b81a-3e5372bef33f/dnsmasq-dns/0.log" Dec 04 10:54:27 crc kubenswrapper[4776]: I1204 10:54:27.420806 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_77d85923-9fcd-437f-b584-6e86641bccdf/glance-httpd/0.log" Dec 04 10:54:27 crc kubenswrapper[4776]: I1204 10:54:27.550072 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_77d85923-9fcd-437f-b584-6e86641bccdf/glance-log/0.log" Dec 04 10:54:27 crc kubenswrapper[4776]: I1204 10:54:27.615300 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7cdcbd14-b300-4a1d-b3c5-0cf70e20b290/glance-log/0.log" Dec 04 10:54:27 crc kubenswrapper[4776]: I1204 10:54:27.664699 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7cdcbd14-b300-4a1d-b3c5-0cf70e20b290/glance-httpd/0.log" Dec 04 10:54:27 crc kubenswrapper[4776]: I1204 10:54:27.886712 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw_c2066384-4861-4b8b-8a26-ccdafaa3394d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:27 crc kubenswrapper[4776]: I1204 10:54:27.938675 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d787f787d-lqf8p_1ec22398-eab3-46af-8843-1c71a2f5db12/horizon/0.log" Dec 04 10:54:28 crc kubenswrapper[4776]: I1204 10:54:28.010559 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d787f787d-lqf8p_1ec22398-eab3-46af-8843-1c71a2f5db12/horizon-log/0.log" Dec 04 10:54:28 crc kubenswrapper[4776]: I1204 10:54:28.081737 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-zhww5_4696f658-d3e7-4aee-9569-80a393613cb9/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:28 crc kubenswrapper[4776]: I1204 10:54:28.326631 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29414041-vtpvx_9d261d84-4a7d-4b97-bffa-be0cae0c8102/keystone-cron/0.log" Dec 04 10:54:28 crc kubenswrapper[4776]: I1204 10:54:28.359906 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c2d3be34-5565-4f76-8afe-50df0f2a558f/kube-state-metrics/0.log" Dec 04 10:54:28 crc kubenswrapper[4776]: I1204 10:54:28.546246 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq_c95fc34d-f4d9-45d9-acf3-a4fb114a972e/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:28 crc kubenswrapper[4776]: I1204 10:54:28.984406 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_aa8ba719-bab7-4330-97b3-1e1e35d20784/manila-api/0.log" Dec 04 10:54:29 crc kubenswrapper[4776]: I1204 10:54:29.084985 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-scheduler-0_2c756ff2-9b1e-42d0-97ca-e173b0de24d5/probe/0.log" Dec 04 10:54:29 crc kubenswrapper[4776]: I1204 10:54:29.123409 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_2c756ff2-9b1e-42d0-97ca-e173b0de24d5/manila-scheduler/0.log" Dec 04 10:54:29 crc kubenswrapper[4776]: I1204 10:54:29.202126 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-86f58b95b9-j2njt_cb3d4759-6025-4713-90f2-7e7825ad18d3/keystone-api/0.log" Dec 04 10:54:29 crc kubenswrapper[4776]: I1204 10:54:29.368959 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_13270572-bb8d-45b4-aa78-156fc1b09a73/probe/0.log" Dec 04 10:54:29 crc kubenswrapper[4776]: I1204 10:54:29.642176 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_13270572-bb8d-45b4-aa78-156fc1b09a73/manila-share/0.log" Dec 04 10:54:29 crc kubenswrapper[4776]: I1204 10:54:29.672785 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_aa8ba719-bab7-4330-97b3-1e1e35d20784/manila-api-log/0.log" Dec 04 10:54:29 crc kubenswrapper[4776]: I1204 10:54:29.764386 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5779dfffd5-drdt5_4cbc85fe-8de3-45de-83d6-69da6e1b18d4/neutron-api/0.log" Dec 04 10:54:29 crc kubenswrapper[4776]: I1204 10:54:29.852858 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5779dfffd5-drdt5_4cbc85fe-8de3-45de-83d6-69da6e1b18d4/neutron-httpd/0.log" Dec 04 10:54:29 crc kubenswrapper[4776]: I1204 10:54:29.963720 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn_c3288d5d-8705-4058-ac67-ef3c5e0e0359/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:30 crc kubenswrapper[4776]: I1204 10:54:30.297574 4776 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_nova-api-0_519810d5-1e42-413c-893d-81e992b49d5b/nova-api-log/0.log" Dec 04 10:54:30 crc kubenswrapper[4776]: I1204 10:54:30.414629 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a5ae80cb-2f88-4707-aa4a-777b2d4e3b99/nova-cell0-conductor-conductor/0.log" Dec 04 10:54:30 crc kubenswrapper[4776]: I1204 10:54:30.573282 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_519810d5-1e42-413c-893d-81e992b49d5b/nova-api-api/0.log" Dec 04 10:54:30 crc kubenswrapper[4776]: I1204 10:54:30.627490 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_eb549ab7-99fa-4631-b0b5-d4a029e7de33/nova-cell1-conductor-conductor/0.log" Dec 04 10:54:30 crc kubenswrapper[4776]: I1204 10:54:30.709985 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ea0d417e-f205-4aa7-bc96-ba6879069b4a/nova-cell1-novncproxy-novncproxy/0.log" Dec 04 10:54:30 crc kubenswrapper[4776]: I1204 10:54:30.874274 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9_700d0cc0-f03a-47f4-bb74-d727bda5f904/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:31 crc kubenswrapper[4776]: I1204 10:54:31.000671 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_840a5d28-ff84-411a-837a-5976118c262d/nova-metadata-log/0.log" Dec 04 10:54:31 crc kubenswrapper[4776]: I1204 10:54:31.329769 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_be0c172a-45d2-4fab-940c-f343c9e227fc/mysql-bootstrap/0.log" Dec 04 10:54:31 crc kubenswrapper[4776]: I1204 10:54:31.359640 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a3abc5a5-f26f-4c50-9780-b79f683b4243/nova-scheduler-scheduler/0.log" Dec 04 10:54:31 crc 
kubenswrapper[4776]: I1204 10:54:31.520206 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_be0c172a-45d2-4fab-940c-f343c9e227fc/mysql-bootstrap/0.log" Dec 04 10:54:31 crc kubenswrapper[4776]: I1204 10:54:31.580900 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_be0c172a-45d2-4fab-940c-f343c9e227fc/galera/0.log" Dec 04 10:54:31 crc kubenswrapper[4776]: I1204 10:54:31.720374 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b38109cb-9fe9-429d-b580-999d6978f536/mysql-bootstrap/0.log" Dec 04 10:54:31 crc kubenswrapper[4776]: I1204 10:54:31.885182 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b38109cb-9fe9-429d-b580-999d6978f536/mysql-bootstrap/0.log" Dec 04 10:54:31 crc kubenswrapper[4776]: I1204 10:54:31.944862 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b38109cb-9fe9-429d-b580-999d6978f536/galera/0.log" Dec 04 10:54:32 crc kubenswrapper[4776]: I1204 10:54:32.099890 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5d93526f-f97b-4a2a-98b4-4b880a99cbd7/openstackclient/0.log" Dec 04 10:54:32 crc kubenswrapper[4776]: I1204 10:54:32.206630 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-2wd87_ec49e875-c217-4d3a-b821-a870a4ad1d24/openstack-network-exporter/0.log" Dec 04 10:54:32 crc kubenswrapper[4776]: I1204 10:54:32.402042 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9mnct_bd5ed17f-c4f4-4b17-b14c-d8717fc116f6/ovsdb-server-init/0.log" Dec 04 10:54:32 crc kubenswrapper[4776]: I1204 10:54:32.451983 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:54:32 crc kubenswrapper[4776]: E1204 10:54:32.452418 4776 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:54:32 crc kubenswrapper[4776]: I1204 10:54:32.610148 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9mnct_bd5ed17f-c4f4-4b17-b14c-d8717fc116f6/ovsdb-server/0.log" Dec 04 10:54:32 crc kubenswrapper[4776]: I1204 10:54:32.610658 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9mnct_bd5ed17f-c4f4-4b17-b14c-d8717fc116f6/ovsdb-server-init/0.log" Dec 04 10:54:32 crc kubenswrapper[4776]: I1204 10:54:32.625806 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9mnct_bd5ed17f-c4f4-4b17-b14c-d8717fc116f6/ovs-vswitchd/0.log" Dec 04 10:54:32 crc kubenswrapper[4776]: I1204 10:54:32.634801 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_840a5d28-ff84-411a-837a-5976118c262d/nova-metadata-metadata/0.log" Dec 04 10:54:32 crc kubenswrapper[4776]: I1204 10:54:32.841552 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-tchdq_1100839e-9cfb-4361-a653-321d0d431072/ovn-controller/0.log" Dec 04 10:54:32 crc kubenswrapper[4776]: I1204 10:54:32.964095 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fhnqg_a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:33 crc kubenswrapper[4776]: I1204 10:54:33.143804 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b19391ae-29bb-4eef-a99b-c8746488c6f5/openstack-network-exporter/0.log" Dec 04 
10:54:33 crc kubenswrapper[4776]: I1204 10:54:33.147831 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b19391ae-29bb-4eef-a99b-c8746488c6f5/ovn-northd/0.log" Dec 04 10:54:33 crc kubenswrapper[4776]: I1204 10:54:33.335541 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c35b05be-3fec-4a42-af88-c80ad4c6833e/ovsdbserver-nb/0.log" Dec 04 10:54:33 crc kubenswrapper[4776]: I1204 10:54:33.350955 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c35b05be-3fec-4a42-af88-c80ad4c6833e/openstack-network-exporter/0.log" Dec 04 10:54:33 crc kubenswrapper[4776]: I1204 10:54:33.444898 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d1e14cd-4110-4ed1-9884-1318d980a844/openstack-network-exporter/0.log" Dec 04 10:54:33 crc kubenswrapper[4776]: I1204 10:54:33.559675 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d1e14cd-4110-4ed1-9884-1318d980a844/ovsdbserver-sb/0.log" Dec 04 10:54:33 crc kubenswrapper[4776]: I1204 10:54:33.699541 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d564c574b-x8jlb_38afdd55-240c-4460-aa5f-2dbbeb0b0f29/placement-api/0.log" Dec 04 10:54:33 crc kubenswrapper[4776]: I1204 10:54:33.774761 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d564c574b-x8jlb_38afdd55-240c-4460-aa5f-2dbbeb0b0f29/placement-log/0.log" Dec 04 10:54:34 crc kubenswrapper[4776]: I1204 10:54:34.369716 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f/setup-container/0.log" Dec 04 10:54:34 crc kubenswrapper[4776]: I1204 10:54:34.561670 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f/setup-container/0.log" Dec 04 10:54:34 crc 
kubenswrapper[4776]: I1204 10:54:34.574980 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f/rabbitmq/0.log" Dec 04 10:54:34 crc kubenswrapper[4776]: I1204 10:54:34.693046 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6b295c1b-fc2b-4e58-9175-992ce31b3a3c/setup-container/0.log" Dec 04 10:54:34 crc kubenswrapper[4776]: I1204 10:54:34.885519 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6b295c1b-fc2b-4e58-9175-992ce31b3a3c/setup-container/0.log" Dec 04 10:54:34 crc kubenswrapper[4776]: I1204 10:54:34.937798 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl_9166f367-b1aa-46ad-945d-d1653c18a914/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:35 crc kubenswrapper[4776]: I1204 10:54:35.002384 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6b295c1b-fc2b-4e58-9175-992ce31b3a3c/rabbitmq/0.log" Dec 04 10:54:35 crc kubenswrapper[4776]: I1204 10:54:35.178317 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9_6ac53b67-6fc4-413b-b712-80cc35fd786e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:35 crc kubenswrapper[4776]: I1204 10:54:35.238975 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-z8x2x_6b269d5b-a372-4bb4-8e6c-558e97ce60cf/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:35 crc kubenswrapper[4776]: I1204 10:54:35.415794 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-bk6g6_5a1d46aa-2142-479a-9f26-2e8d24b69dca/ssh-known-hosts-edpm-deployment/0.log" Dec 04 10:54:35 crc kubenswrapper[4776]: I1204 10:54:35.649446 4776 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5e42f4d6-4793-4568-9a55-4d346b39dbac/tempest-tests-tempest-tests-runner/0.log" Dec 04 10:54:36 crc kubenswrapper[4776]: I1204 10:54:36.274153 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9grzp_9836203f-04e7-4179-b4fa-8e133dbe8e5a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 10:54:36 crc kubenswrapper[4776]: I1204 10:54:36.317606 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0c9e28cc-5a44-4315-a018-c0678bc68347/test-operator-logs-container/0.log" Dec 04 10:54:45 crc kubenswrapper[4776]: I1204 10:54:45.469557 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:54:45 crc kubenswrapper[4776]: E1204 10:54:45.470486 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:54:50 crc kubenswrapper[4776]: I1204 10:54:50.561740 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c1597214-7e53-46e4-8ba2-3732fc1ebf29/memcached/0.log" Dec 04 10:55:00 crc kubenswrapper[4776]: I1204 10:55:00.452556 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:55:00 crc kubenswrapper[4776]: E1204 10:55:00.453558 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:55:05 crc kubenswrapper[4776]: I1204 10:55:05.230849 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jtqt7_ec5df28d-5944-43f3-bf28-12e1062b1060/kube-rbac-proxy/0.log" Dec 04 10:55:05 crc kubenswrapper[4776]: I1204 10:55:05.730880 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-8h27m_61813ce8-b03b-473b-9606-22515ab1de03/kube-rbac-proxy/0.log" Dec 04 10:55:05 crc kubenswrapper[4776]: I1204 10:55:05.840272 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-8h27m_61813ce8-b03b-473b-9606-22515ab1de03/manager/0.log" Dec 04 10:55:05 crc kubenswrapper[4776]: I1204 10:55:05.871762 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jtqt7_ec5df28d-5944-43f3-bf28-12e1062b1060/manager/0.log" Dec 04 10:55:06 crc kubenswrapper[4776]: I1204 10:55:06.009091 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-z8q57_df5a8995-658c-4525-93ac-604d3c2af213/kube-rbac-proxy/0.log" Dec 04 10:55:06 crc kubenswrapper[4776]: I1204 10:55:06.095400 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-z8q57_df5a8995-658c-4525-93ac-604d3c2af213/manager/0.log" Dec 04 10:55:06 crc kubenswrapper[4776]: I1204 10:55:06.200556 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/util/0.log" Dec 04 10:55:06 crc kubenswrapper[4776]: I1204 10:55:06.382139 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/pull/0.log" Dec 04 10:55:06 crc kubenswrapper[4776]: I1204 10:55:06.384429 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/util/0.log" Dec 04 10:55:06 crc kubenswrapper[4776]: I1204 10:55:06.456904 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/pull/0.log" Dec 04 10:55:06 crc kubenswrapper[4776]: I1204 10:55:06.594996 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/pull/0.log" Dec 04 10:55:06 crc kubenswrapper[4776]: I1204 10:55:06.614868 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/extract/0.log" Dec 04 10:55:06 crc kubenswrapper[4776]: I1204 10:55:06.627434 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/util/0.log" Dec 04 10:55:06 crc kubenswrapper[4776]: I1204 10:55:06.775609 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-d7zhq_2ceaf037-5fce-4ef5-b273-724eb446e0af/kube-rbac-proxy/0.log" Dec 04 10:55:06 crc 
kubenswrapper[4776]: I1204 10:55:06.883301 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-d7zhq_2ceaf037-5fce-4ef5-b273-724eb446e0af/manager/0.log" Dec 04 10:55:06 crc kubenswrapper[4776]: I1204 10:55:06.890630 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-x9jlc_25849bc1-46e2-4ff1-a61a-f0b7105290bf/kube-rbac-proxy/0.log" Dec 04 10:55:07 crc kubenswrapper[4776]: I1204 10:55:07.013943 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-x9jlc_25849bc1-46e2-4ff1-a61a-f0b7105290bf/manager/0.log" Dec 04 10:55:07 crc kubenswrapper[4776]: I1204 10:55:07.094822 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-z6kf6_34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe/kube-rbac-proxy/0.log" Dec 04 10:55:07 crc kubenswrapper[4776]: I1204 10:55:07.151831 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-z6kf6_34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe/manager/0.log" Dec 04 10:55:07 crc kubenswrapper[4776]: I1204 10:55:07.311840 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-fk6f5_a0857db7-00e4-410c-b5a2-945a46ae175a/kube-rbac-proxy/0.log" Dec 04 10:55:07 crc kubenswrapper[4776]: I1204 10:55:07.433470 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-ldf84_58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1/kube-rbac-proxy/0.log" Dec 04 10:55:07 crc kubenswrapper[4776]: I1204 10:55:07.466880 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-fk6f5_a0857db7-00e4-410c-b5a2-945a46ae175a/manager/0.log" Dec 04 10:55:07 crc kubenswrapper[4776]: I1204 10:55:07.564667 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-ldf84_58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1/manager/0.log" Dec 04 10:55:07 crc kubenswrapper[4776]: I1204 10:55:07.675347 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-zq7wg_6171555b-a2ba-4177-b7d7-3bb5496a99bd/kube-rbac-proxy/0.log" Dec 04 10:55:07 crc kubenswrapper[4776]: I1204 10:55:07.717391 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-zq7wg_6171555b-a2ba-4177-b7d7-3bb5496a99bd/manager/0.log" Dec 04 10:55:07 crc kubenswrapper[4776]: I1204 10:55:07.849552 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-79d898f8f7-lbtlb_eca2af80-0e84-4615-9bd7-a907029259e7/kube-rbac-proxy/0.log" Dec 04 10:55:07 crc kubenswrapper[4776]: I1204 10:55:07.977627 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-79d898f8f7-lbtlb_eca2af80-0e84-4615-9bd7-a907029259e7/manager/0.log" Dec 04 10:55:08 crc kubenswrapper[4776]: I1204 10:55:08.034830 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-5l4h4_fe5ac80c-367a-489b-901e-76d872a26e4b/kube-rbac-proxy/0.log" Dec 04 10:55:08 crc kubenswrapper[4776]: I1204 10:55:08.122876 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-5l4h4_fe5ac80c-367a-489b-901e-76d872a26e4b/manager/0.log" Dec 04 10:55:08 crc kubenswrapper[4776]: I1204 10:55:08.251395 4776 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-4d8fg_23b5c3d3-b677-4440-b489-9e1811b722bb/kube-rbac-proxy/0.log" Dec 04 10:55:08 crc kubenswrapper[4776]: I1204 10:55:08.312566 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-4d8fg_23b5c3d3-b677-4440-b489-9e1811b722bb/manager/0.log" Dec 04 10:55:08 crc kubenswrapper[4776]: I1204 10:55:08.426777 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-ft7rc_17848cf1-eceb-4e3e-9e39-40a7e4507d6b/kube-rbac-proxy/0.log" Dec 04 10:55:08 crc kubenswrapper[4776]: I1204 10:55:08.553435 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-ft7rc_17848cf1-eceb-4e3e-9e39-40a7e4507d6b/manager/0.log" Dec 04 10:55:08 crc kubenswrapper[4776]: I1204 10:55:08.673931 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mlnr6_115873e4-456f-4d60-84f0-182f467cb8c0/kube-rbac-proxy/0.log" Dec 04 10:55:08 crc kubenswrapper[4776]: I1204 10:55:08.718158 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mlnr6_115873e4-456f-4d60-84f0-182f467cb8c0/manager/0.log" Dec 04 10:55:08 crc kubenswrapper[4776]: I1204 10:55:08.891118 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj_ec5e5439-8cfc-4e75-9627-45e4999aacea/kube-rbac-proxy/0.log" Dec 04 10:55:08 crc kubenswrapper[4776]: I1204 10:55:08.959902 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj_ec5e5439-8cfc-4e75-9627-45e4999aacea/manager/0.log" Dec 
04 10:55:09 crc kubenswrapper[4776]: I1204 10:55:09.463554 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qr9vs_f6f9de95-98b9-47ca-b4a0-c5a99ca9a610/registry-server/0.log" Dec 04 10:55:09 crc kubenswrapper[4776]: I1204 10:55:09.536836 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6db4dd56f6-5962s_730ff180-62d9-4a70-b200-e2ac3ea2b4c8/operator/0.log" Dec 04 10:55:09 crc kubenswrapper[4776]: I1204 10:55:09.918683 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hscd7_50a0ede3-8c98-47c6-945e-6aeefa27f86e/kube-rbac-proxy/0.log" Dec 04 10:55:10 crc kubenswrapper[4776]: I1204 10:55:10.098861 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-pzjlr_8f26eb91-a638-4ba9-9547-7bef2c5513c4/kube-rbac-proxy/0.log" Dec 04 10:55:10 crc kubenswrapper[4776]: I1204 10:55:10.117528 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hscd7_50a0ede3-8c98-47c6-945e-6aeefa27f86e/manager/0.log" Dec 04 10:55:10 crc kubenswrapper[4776]: I1204 10:55:10.212549 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-pzjlr_8f26eb91-a638-4ba9-9547-7bef2c5513c4/manager/0.log" Dec 04 10:55:10 crc kubenswrapper[4776]: I1204 10:55:10.432593 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-s767z_615d312b-bd1f-40c3-b499-a7c4ae351cd3/operator/0.log" Dec 04 10:55:10 crc kubenswrapper[4776]: I1204 10:55:10.456247 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-8wwhc_c0269b5f-db90-427e-933b-6221bcfbde9e/kube-rbac-proxy/0.log" Dec 04 10:55:10 crc kubenswrapper[4776]: I1204 10:55:10.486290 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7b845677-nvxnd_72061fb8-5546-4ced-ba4a-f7faeeebec85/manager/0.log" Dec 04 10:55:10 crc kubenswrapper[4776]: I1204 10:55:10.583980 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-8wwhc_c0269b5f-db90-427e-933b-6221bcfbde9e/manager/0.log" Dec 04 10:55:10 crc kubenswrapper[4776]: I1204 10:55:10.651388 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-mkn8n_725f674d-7785-4bb1-95d2-2a650b9f4df8/kube-rbac-proxy/0.log" Dec 04 10:55:10 crc kubenswrapper[4776]: I1204 10:55:10.745418 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-mkn8n_725f674d-7785-4bb1-95d2-2a650b9f4df8/manager/0.log" Dec 04 10:55:10 crc kubenswrapper[4776]: I1204 10:55:10.841457 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wbcs6_f6f8f6ca-820b-41e8-af0a-aa6b439a3dad/kube-rbac-proxy/0.log" Dec 04 10:55:10 crc kubenswrapper[4776]: I1204 10:55:10.889914 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wbcs6_f6f8f6ca-820b-41e8-af0a-aa6b439a3dad/manager/0.log" Dec 04 10:55:10 crc kubenswrapper[4776]: I1204 10:55:10.973870 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-4c2d9_6bab5c22-f51d-4049-adb5-343a7195eeb7/kube-rbac-proxy/0.log" Dec 04 10:55:11 crc kubenswrapper[4776]: I1204 10:55:11.008179 
4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-4c2d9_6bab5c22-f51d-4049-adb5-343a7195eeb7/manager/0.log" Dec 04 10:55:15 crc kubenswrapper[4776]: I1204 10:55:15.455090 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:55:15 crc kubenswrapper[4776]: E1204 10:55:15.456740 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:55:28 crc kubenswrapper[4776]: I1204 10:55:28.452734 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:55:28 crc kubenswrapper[4776]: E1204 10:55:28.453697 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:55:31 crc kubenswrapper[4776]: I1204 10:55:31.703103 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-4xnmk_696d9668-ce83-427c-8b8c-cb069a6c1b26/control-plane-machine-set-operator/0.log" Dec 04 10:55:31 crc kubenswrapper[4776]: I1204 10:55:31.825298 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lc8p8_34aa75c9-39fc-49eb-b338-d2b1a36535a8/kube-rbac-proxy/0.log" Dec 04 10:55:31 crc kubenswrapper[4776]: I1204 10:55:31.868228 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lc8p8_34aa75c9-39fc-49eb-b338-d2b1a36535a8/machine-api-operator/0.log" Dec 04 10:55:41 crc kubenswrapper[4776]: I1204 10:55:41.452502 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:55:41 crc kubenswrapper[4776]: E1204 10:55:41.453231 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:55:45 crc kubenswrapper[4776]: I1204 10:55:45.988934 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-lxxbd_4ac63db5-784c-4a99-a405-75c3d9f3909c/cert-manager-controller/0.log" Dec 04 10:55:46 crc kubenswrapper[4776]: I1204 10:55:46.113094 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-4vx5h_d19ecdb4-7502-46be-b833-c0f7608c5ce4/cert-manager-cainjector/0.log" Dec 04 10:55:46 crc kubenswrapper[4776]: I1204 10:55:46.168476 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-s9v8v_688382e7-42ed-4f38-bd1e-3a0b40fa42bf/cert-manager-webhook/0.log" Dec 04 10:55:55 crc kubenswrapper[4776]: I1204 10:55:55.459344 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:55:55 crc 
kubenswrapper[4776]: E1204 10:55:55.460127 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:55:59 crc kubenswrapper[4776]: I1204 10:55:59.062547 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-pndwq_86108b12-167c-4f7f-bbbf-566c1158e81c/nmstate-console-plugin/0.log" Dec 04 10:55:59 crc kubenswrapper[4776]: I1204 10:55:59.322953 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-ffbgz_fe3aabeb-baf7-4d17-ab72-485cb4412799/kube-rbac-proxy/0.log" Dec 04 10:55:59 crc kubenswrapper[4776]: I1204 10:55:59.342248 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-2bzm7_99323994-d641-4eb4-b540-41bc2f5241ee/nmstate-handler/0.log" Dec 04 10:55:59 crc kubenswrapper[4776]: I1204 10:55:59.375461 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-ffbgz_fe3aabeb-baf7-4d17-ab72-485cb4412799/nmstate-metrics/0.log" Dec 04 10:55:59 crc kubenswrapper[4776]: I1204 10:55:59.522264 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-4rmmf_af38e5f3-7acd-482c-9561-91789c242956/nmstate-operator/0.log" Dec 04 10:55:59 crc kubenswrapper[4776]: I1204 10:55:59.574912 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-ns5br_f0f2d721-f66d-4f50-8b31-2a879a904faf/nmstate-webhook/0.log" Dec 04 10:56:07 crc kubenswrapper[4776]: I1204 10:56:07.453034 4776 scope.go:117] 
"RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:56:07 crc kubenswrapper[4776]: E1204 10:56:07.453820 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 10:56:14 crc kubenswrapper[4776]: I1204 10:56:14.306717 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-nk6g7_cac2d534-af69-46ca-ab51-5ba3b56999fe/kube-rbac-proxy/0.log" Dec 04 10:56:14 crc kubenswrapper[4776]: I1204 10:56:14.474001 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-nk6g7_cac2d534-af69-46ca-ab51-5ba3b56999fe/controller/0.log" Dec 04 10:56:14 crc kubenswrapper[4776]: I1204 10:56:14.564102 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-rgsnm_eab1cf4a-97de-4d47-a34d-503d31d32d77/frr-k8s-webhook-server/0.log" Dec 04 10:56:14 crc kubenswrapper[4776]: I1204 10:56:14.679798 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-frr-files/0.log" Dec 04 10:56:14 crc kubenswrapper[4776]: I1204 10:56:14.892492 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-metrics/0.log" Dec 04 10:56:14 crc kubenswrapper[4776]: I1204 10:56:14.900388 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-frr-files/0.log" Dec 04 10:56:14 crc kubenswrapper[4776]: I1204 10:56:14.905529 4776 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-reloader/0.log" Dec 04 10:56:14 crc kubenswrapper[4776]: I1204 10:56:14.932197 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-reloader/0.log" Dec 04 10:56:15 crc kubenswrapper[4776]: I1204 10:56:15.684202 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-frr-files/0.log" Dec 04 10:56:15 crc kubenswrapper[4776]: I1204 10:56:15.711513 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-reloader/0.log" Dec 04 10:56:15 crc kubenswrapper[4776]: I1204 10:56:15.715844 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-metrics/0.log" Dec 04 10:56:15 crc kubenswrapper[4776]: I1204 10:56:15.724762 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-metrics/0.log" Dec 04 10:56:15 crc kubenswrapper[4776]: I1204 10:56:15.901356 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-frr-files/0.log" Dec 04 10:56:15 crc kubenswrapper[4776]: I1204 10:56:15.922614 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-metrics/0.log" Dec 04 10:56:15 crc kubenswrapper[4776]: I1204 10:56:15.961186 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-reloader/0.log" Dec 04 10:56:15 crc kubenswrapper[4776]: I1204 10:56:15.969395 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/controller/0.log" Dec 04 10:56:16 crc kubenswrapper[4776]: I1204 10:56:16.128494 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/frr-metrics/0.log" Dec 04 10:56:16 crc kubenswrapper[4776]: I1204 10:56:16.179176 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/kube-rbac-proxy-frr/0.log" Dec 04 10:56:16 crc kubenswrapper[4776]: I1204 10:56:16.185893 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/kube-rbac-proxy/0.log" Dec 04 10:56:16 crc kubenswrapper[4776]: I1204 10:56:16.400514 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/reloader/0.log" Dec 04 10:56:16 crc kubenswrapper[4776]: I1204 10:56:16.423701 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-855d6cf46f-579qr_0887beaf-a370-4268-9011-8278551d91bd/manager/0.log" Dec 04 10:56:16 crc kubenswrapper[4776]: I1204 10:56:16.608197 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-769c6857f6-zvn5n_82bf45e3-e222-4569-bedd-5c160fa3f1d4/webhook-server/0.log" Dec 04 10:56:16 crc kubenswrapper[4776]: I1204 10:56:16.761643 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lp9tk_2ab18bfc-af5b-4be8-b481-7fdc03809bde/kube-rbac-proxy/0.log" Dec 04 10:56:17 crc kubenswrapper[4776]: I1204 10:56:17.322801 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lp9tk_2ab18bfc-af5b-4be8-b481-7fdc03809bde/speaker/0.log" Dec 04 10:56:17 crc kubenswrapper[4776]: I1204 10:56:17.788730 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/frr/0.log" Dec 04 10:56:21 crc kubenswrapper[4776]: I1204 10:56:21.452799 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:56:22 crc kubenswrapper[4776]: I1204 10:56:22.237975 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"bfded1b1474327da4d264d6d8de5c8bf4e532e20c1ed6eee7d088941c6d231dc"} Dec 04 10:56:29 crc kubenswrapper[4776]: I1204 10:56:29.126786 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/util/0.log" Dec 04 10:56:29 crc kubenswrapper[4776]: I1204 10:56:29.315009 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/pull/0.log" Dec 04 10:56:29 crc kubenswrapper[4776]: I1204 10:56:29.368031 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/pull/0.log" Dec 04 10:56:29 crc kubenswrapper[4776]: I1204 10:56:29.388668 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/util/0.log" Dec 04 10:56:29 crc kubenswrapper[4776]: I1204 10:56:29.541861 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/pull/0.log" Dec 04 10:56:29 crc kubenswrapper[4776]: I1204 10:56:29.548905 4776 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/util/0.log" Dec 04 10:56:29 crc kubenswrapper[4776]: I1204 10:56:29.590052 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/extract/0.log" Dec 04 10:56:29 crc kubenswrapper[4776]: I1204 10:56:29.751511 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/util/0.log" Dec 04 10:56:29 crc kubenswrapper[4776]: I1204 10:56:29.912016 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/pull/0.log" Dec 04 10:56:29 crc kubenswrapper[4776]: I1204 10:56:29.921349 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/util/0.log" Dec 04 10:56:29 crc kubenswrapper[4776]: I1204 10:56:29.928269 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/pull/0.log" Dec 04 10:56:30 crc kubenswrapper[4776]: I1204 10:56:30.510861 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/util/0.log" Dec 04 10:56:30 crc kubenswrapper[4776]: I1204 10:56:30.534040 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/extract/0.log" Dec 04 10:56:30 crc kubenswrapper[4776]: I1204 10:56:30.576386 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/pull/0.log" Dec 04 10:56:30 crc kubenswrapper[4776]: I1204 10:56:30.709545 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/extract-utilities/0.log" Dec 04 10:56:30 crc kubenswrapper[4776]: I1204 10:56:30.850207 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/extract-utilities/0.log" Dec 04 10:56:30 crc kubenswrapper[4776]: I1204 10:56:30.868092 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/extract-content/0.log" Dec 04 10:56:30 crc kubenswrapper[4776]: I1204 10:56:30.881542 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/extract-content/0.log" Dec 04 10:56:31 crc kubenswrapper[4776]: I1204 10:56:31.052833 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/extract-content/0.log" Dec 04 10:56:31 crc kubenswrapper[4776]: I1204 10:56:31.054220 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/extract-utilities/0.log" Dec 04 10:56:31 crc kubenswrapper[4776]: I1204 10:56:31.293240 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/extract-utilities/0.log" Dec 04 10:56:31 crc kubenswrapper[4776]: I1204 10:56:31.469020 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/extract-utilities/0.log" Dec 04 10:56:31 crc kubenswrapper[4776]: I1204 10:56:31.491949 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/extract-content/0.log" Dec 04 10:56:31 crc kubenswrapper[4776]: I1204 10:56:31.497348 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/extract-content/0.log" Dec 04 10:56:31 crc kubenswrapper[4776]: I1204 10:56:31.690609 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/registry-server/0.log" Dec 04 10:56:31 crc kubenswrapper[4776]: I1204 10:56:31.718999 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/extract-content/0.log" Dec 04 10:56:31 crc kubenswrapper[4776]: I1204 10:56:31.746227 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/extract-utilities/0.log" Dec 04 10:56:32 crc kubenswrapper[4776]: I1204 10:56:32.350210 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nvzbs_d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8/marketplace-operator/0.log" Dec 04 10:56:32 crc kubenswrapper[4776]: I1204 10:56:32.585306 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/registry-server/0.log" Dec 04 10:56:32 crc kubenswrapper[4776]: I1204 10:56:32.659666 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/extract-utilities/0.log" Dec 04 10:56:32 crc kubenswrapper[4776]: I1204 10:56:32.800901 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/extract-content/0.log" Dec 04 10:56:32 crc kubenswrapper[4776]: I1204 10:56:32.808561 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/extract-content/0.log" Dec 04 10:56:32 crc kubenswrapper[4776]: I1204 10:56:32.815493 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/extract-utilities/0.log" Dec 04 10:56:33 crc kubenswrapper[4776]: I1204 10:56:33.006577 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/extract-content/0.log" Dec 04 10:56:33 crc kubenswrapper[4776]: I1204 10:56:33.077354 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/extract-utilities/0.log" Dec 04 10:56:33 crc kubenswrapper[4776]: I1204 10:56:33.100448 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/extract-utilities/0.log" Dec 04 10:56:33 crc kubenswrapper[4776]: I1204 10:56:33.135442 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/registry-server/0.log" Dec 04 10:56:33 crc kubenswrapper[4776]: I1204 10:56:33.268699 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/extract-utilities/0.log" Dec 04 10:56:33 crc kubenswrapper[4776]: I1204 10:56:33.290266 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/extract-content/0.log" Dec 04 10:56:33 crc kubenswrapper[4776]: I1204 10:56:33.294448 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/extract-content/0.log" Dec 04 10:56:33 crc kubenswrapper[4776]: I1204 10:56:33.471662 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/extract-content/0.log" Dec 04 10:56:33 crc kubenswrapper[4776]: I1204 10:56:33.473103 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/extract-utilities/0.log" Dec 04 10:56:34 crc kubenswrapper[4776]: I1204 10:56:34.203322 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/registry-server/0.log" Dec 04 10:57:02 crc kubenswrapper[4776]: I1204 10:57:02.873757 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9tg6k"] Dec 04 10:57:02 crc kubenswrapper[4776]: E1204 10:57:02.874823 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bfe5f4-2080-4eb4-8a2e-b38600e7af04" containerName="container-00" Dec 04 10:57:02 crc kubenswrapper[4776]: I1204 10:57:02.874837 4776 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="84bfe5f4-2080-4eb4-8a2e-b38600e7af04" containerName="container-00" Dec 04 10:57:02 crc kubenswrapper[4776]: I1204 10:57:02.875039 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="84bfe5f4-2080-4eb4-8a2e-b38600e7af04" containerName="container-00" Dec 04 10:57:02 crc kubenswrapper[4776]: I1204 10:57:02.876278 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:02 crc kubenswrapper[4776]: I1204 10:57:02.903668 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9tg6k"] Dec 04 10:57:02 crc kubenswrapper[4776]: I1204 10:57:02.948602 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-utilities\") pod \"community-operators-9tg6k\" (UID: \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\") " pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:02 crc kubenswrapper[4776]: I1204 10:57:02.948672 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8gqw\" (UniqueName: \"kubernetes.io/projected/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-kube-api-access-t8gqw\") pod \"community-operators-9tg6k\" (UID: \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\") " pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:02 crc kubenswrapper[4776]: I1204 10:57:02.948881 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-catalog-content\") pod \"community-operators-9tg6k\" (UID: \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\") " pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:03 crc kubenswrapper[4776]: I1204 10:57:03.050062 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-catalog-content\") pod \"community-operators-9tg6k\" (UID: \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\") " pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:03 crc kubenswrapper[4776]: I1204 10:57:03.050202 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-utilities\") pod \"community-operators-9tg6k\" (UID: \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\") " pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:03 crc kubenswrapper[4776]: I1204 10:57:03.050241 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8gqw\" (UniqueName: \"kubernetes.io/projected/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-kube-api-access-t8gqw\") pod \"community-operators-9tg6k\" (UID: \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\") " pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:03 crc kubenswrapper[4776]: I1204 10:57:03.050587 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-catalog-content\") pod \"community-operators-9tg6k\" (UID: \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\") " pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:03 crc kubenswrapper[4776]: I1204 10:57:03.050616 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-utilities\") pod \"community-operators-9tg6k\" (UID: \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\") " pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:03 crc kubenswrapper[4776]: I1204 10:57:03.072427 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t8gqw\" (UniqueName: \"kubernetes.io/projected/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-kube-api-access-t8gqw\") pod \"community-operators-9tg6k\" (UID: \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\") " pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:03 crc kubenswrapper[4776]: I1204 10:57:03.197993 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:04 crc kubenswrapper[4776]: I1204 10:57:04.067132 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9tg6k"] Dec 04 10:57:04 crc kubenswrapper[4776]: I1204 10:57:04.618092 4776 generic.go:334] "Generic (PLEG): container finished" podID="1f9ecd83-6424-40f6-80d5-b3c1f8df2374" containerID="bc0508bc6dd80b0263b163c48dfba8e42c70d6c07b290e814a6297adf6616414" exitCode=0 Dec 04 10:57:04 crc kubenswrapper[4776]: I1204 10:57:04.618293 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tg6k" event={"ID":"1f9ecd83-6424-40f6-80d5-b3c1f8df2374","Type":"ContainerDied","Data":"bc0508bc6dd80b0263b163c48dfba8e42c70d6c07b290e814a6297adf6616414"} Dec 04 10:57:04 crc kubenswrapper[4776]: I1204 10:57:04.618569 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tg6k" event={"ID":"1f9ecd83-6424-40f6-80d5-b3c1f8df2374","Type":"ContainerStarted","Data":"59c67a1d20766f5aa414255852334391a64d60a424951c61716d22c8864b1136"} Dec 04 10:57:04 crc kubenswrapper[4776]: I1204 10:57:04.621248 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 10:57:05 crc kubenswrapper[4776]: I1204 10:57:05.631717 4776 generic.go:334] "Generic (PLEG): container finished" podID="1f9ecd83-6424-40f6-80d5-b3c1f8df2374" containerID="8d6dce553d64bdbd1510e6e6aa29bc190625e2f6c56eb7fcaed1a74a9ce4ac1e" exitCode=0 Dec 04 10:57:05 crc kubenswrapper[4776]: I1204 
10:57:05.632625 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tg6k" event={"ID":"1f9ecd83-6424-40f6-80d5-b3c1f8df2374","Type":"ContainerDied","Data":"8d6dce553d64bdbd1510e6e6aa29bc190625e2f6c56eb7fcaed1a74a9ce4ac1e"} Dec 04 10:57:06 crc kubenswrapper[4776]: I1204 10:57:06.643072 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tg6k" event={"ID":"1f9ecd83-6424-40f6-80d5-b3c1f8df2374","Type":"ContainerStarted","Data":"bc86da7316e464b0196e4b9037294b2033e43763c7bc02afc9da434953e003e7"} Dec 04 10:57:06 crc kubenswrapper[4776]: I1204 10:57:06.675380 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9tg6k" podStartSLOduration=3.260219142 podStartE2EDuration="4.675359788s" podCreationTimestamp="2025-12-04 10:57:02 +0000 UTC" firstStartedPulling="2025-12-04 10:57:04.620947491 +0000 UTC m=+4669.487427858" lastFinishedPulling="2025-12-04 10:57:06.036088127 +0000 UTC m=+4670.902568504" observedRunningTime="2025-12-04 10:57:06.664863898 +0000 UTC m=+4671.531344285" watchObservedRunningTime="2025-12-04 10:57:06.675359788 +0000 UTC m=+4671.541840155" Dec 04 10:57:13 crc kubenswrapper[4776]: I1204 10:57:13.198155 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:13 crc kubenswrapper[4776]: I1204 10:57:13.198628 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:13 crc kubenswrapper[4776]: I1204 10:57:13.246326 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:13 crc kubenswrapper[4776]: I1204 10:57:13.779135 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9tg6k" Dec 
04 10:57:13 crc kubenswrapper[4776]: I1204 10:57:13.839174 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9tg6k"] Dec 04 10:57:15 crc kubenswrapper[4776]: I1204 10:57:15.744913 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9tg6k" podUID="1f9ecd83-6424-40f6-80d5-b3c1f8df2374" containerName="registry-server" containerID="cri-o://bc86da7316e464b0196e4b9037294b2033e43763c7bc02afc9da434953e003e7" gracePeriod=2 Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.355995 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.358832 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-utilities\") pod \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\" (UID: \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\") " Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.360240 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8gqw\" (UniqueName: \"kubernetes.io/projected/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-kube-api-access-t8gqw\") pod \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\" (UID: \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\") " Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.361232 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-catalog-content\") pod \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\" (UID: \"1f9ecd83-6424-40f6-80d5-b3c1f8df2374\") " Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.360124 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-utilities" (OuterVolumeSpecName: "utilities") pod "1f9ecd83-6424-40f6-80d5-b3c1f8df2374" (UID: "1f9ecd83-6424-40f6-80d5-b3c1f8df2374"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.362324 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.367608 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-kube-api-access-t8gqw" (OuterVolumeSpecName: "kube-api-access-t8gqw") pod "1f9ecd83-6424-40f6-80d5-b3c1f8df2374" (UID: "1f9ecd83-6424-40f6-80d5-b3c1f8df2374"). InnerVolumeSpecName "kube-api-access-t8gqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.423484 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f9ecd83-6424-40f6-80d5-b3c1f8df2374" (UID: "1f9ecd83-6424-40f6-80d5-b3c1f8df2374"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.465529 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8gqw\" (UniqueName: \"kubernetes.io/projected/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-kube-api-access-t8gqw\") on node \"crc\" DevicePath \"\"" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.465588 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9ecd83-6424-40f6-80d5-b3c1f8df2374-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.757397 4776 generic.go:334] "Generic (PLEG): container finished" podID="1f9ecd83-6424-40f6-80d5-b3c1f8df2374" containerID="bc86da7316e464b0196e4b9037294b2033e43763c7bc02afc9da434953e003e7" exitCode=0 Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.757449 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tg6k" event={"ID":"1f9ecd83-6424-40f6-80d5-b3c1f8df2374","Type":"ContainerDied","Data":"bc86da7316e464b0196e4b9037294b2033e43763c7bc02afc9da434953e003e7"} Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.757480 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tg6k" event={"ID":"1f9ecd83-6424-40f6-80d5-b3c1f8df2374","Type":"ContainerDied","Data":"59c67a1d20766f5aa414255852334391a64d60a424951c61716d22c8864b1136"} Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.757499 4776 scope.go:117] "RemoveContainer" containerID="bc86da7316e464b0196e4b9037294b2033e43763c7bc02afc9da434953e003e7" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.757645 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9tg6k" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.804599 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9tg6k"] Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.814947 4776 scope.go:117] "RemoveContainer" containerID="8d6dce553d64bdbd1510e6e6aa29bc190625e2f6c56eb7fcaed1a74a9ce4ac1e" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.826902 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9tg6k"] Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.847099 4776 scope.go:117] "RemoveContainer" containerID="bc0508bc6dd80b0263b163c48dfba8e42c70d6c07b290e814a6297adf6616414" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.883655 4776 scope.go:117] "RemoveContainer" containerID="bc86da7316e464b0196e4b9037294b2033e43763c7bc02afc9da434953e003e7" Dec 04 10:57:16 crc kubenswrapper[4776]: E1204 10:57:16.884114 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc86da7316e464b0196e4b9037294b2033e43763c7bc02afc9da434953e003e7\": container with ID starting with bc86da7316e464b0196e4b9037294b2033e43763c7bc02afc9da434953e003e7 not found: ID does not exist" containerID="bc86da7316e464b0196e4b9037294b2033e43763c7bc02afc9da434953e003e7" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.884154 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc86da7316e464b0196e4b9037294b2033e43763c7bc02afc9da434953e003e7"} err="failed to get container status \"bc86da7316e464b0196e4b9037294b2033e43763c7bc02afc9da434953e003e7\": rpc error: code = NotFound desc = could not find container \"bc86da7316e464b0196e4b9037294b2033e43763c7bc02afc9da434953e003e7\": container with ID starting with bc86da7316e464b0196e4b9037294b2033e43763c7bc02afc9da434953e003e7 not 
found: ID does not exist" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.884182 4776 scope.go:117] "RemoveContainer" containerID="8d6dce553d64bdbd1510e6e6aa29bc190625e2f6c56eb7fcaed1a74a9ce4ac1e" Dec 04 10:57:16 crc kubenswrapper[4776]: E1204 10:57:16.884535 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6dce553d64bdbd1510e6e6aa29bc190625e2f6c56eb7fcaed1a74a9ce4ac1e\": container with ID starting with 8d6dce553d64bdbd1510e6e6aa29bc190625e2f6c56eb7fcaed1a74a9ce4ac1e not found: ID does not exist" containerID="8d6dce553d64bdbd1510e6e6aa29bc190625e2f6c56eb7fcaed1a74a9ce4ac1e" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.884561 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6dce553d64bdbd1510e6e6aa29bc190625e2f6c56eb7fcaed1a74a9ce4ac1e"} err="failed to get container status \"8d6dce553d64bdbd1510e6e6aa29bc190625e2f6c56eb7fcaed1a74a9ce4ac1e\": rpc error: code = NotFound desc = could not find container \"8d6dce553d64bdbd1510e6e6aa29bc190625e2f6c56eb7fcaed1a74a9ce4ac1e\": container with ID starting with 8d6dce553d64bdbd1510e6e6aa29bc190625e2f6c56eb7fcaed1a74a9ce4ac1e not found: ID does not exist" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.884577 4776 scope.go:117] "RemoveContainer" containerID="bc0508bc6dd80b0263b163c48dfba8e42c70d6c07b290e814a6297adf6616414" Dec 04 10:57:16 crc kubenswrapper[4776]: E1204 10:57:16.884773 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc0508bc6dd80b0263b163c48dfba8e42c70d6c07b290e814a6297adf6616414\": container with ID starting with bc0508bc6dd80b0263b163c48dfba8e42c70d6c07b290e814a6297adf6616414 not found: ID does not exist" containerID="bc0508bc6dd80b0263b163c48dfba8e42c70d6c07b290e814a6297adf6616414" Dec 04 10:57:16 crc kubenswrapper[4776]: I1204 10:57:16.884798 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc0508bc6dd80b0263b163c48dfba8e42c70d6c07b290e814a6297adf6616414"} err="failed to get container status \"bc0508bc6dd80b0263b163c48dfba8e42c70d6c07b290e814a6297adf6616414\": rpc error: code = NotFound desc = could not find container \"bc0508bc6dd80b0263b163c48dfba8e42c70d6c07b290e814a6297adf6616414\": container with ID starting with bc0508bc6dd80b0263b163c48dfba8e42c70d6c07b290e814a6297adf6616414 not found: ID does not exist" Dec 04 10:57:17 crc kubenswrapper[4776]: I1204 10:57:17.464546 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f9ecd83-6424-40f6-80d5-b3c1f8df2374" path="/var/lib/kubelet/pods/1f9ecd83-6424-40f6-80d5-b3c1f8df2374/volumes" Dec 04 10:58:29 crc kubenswrapper[4776]: I1204 10:58:29.715041 4776 generic.go:334] "Generic (PLEG): container finished" podID="d6f5f7d0-9c13-4c1c-b13b-718f74ae1320" containerID="f6efb918dc55608406785850c34a8b7304e0a7a9791ab8fbd7d4e24ea7742e31" exitCode=0 Dec 04 10:58:29 crc kubenswrapper[4776]: I1204 10:58:29.715212 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99q5c/must-gather-zf8bk" event={"ID":"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320","Type":"ContainerDied","Data":"f6efb918dc55608406785850c34a8b7304e0a7a9791ab8fbd7d4e24ea7742e31"} Dec 04 10:58:29 crc kubenswrapper[4776]: I1204 10:58:29.716489 4776 scope.go:117] "RemoveContainer" containerID="f6efb918dc55608406785850c34a8b7304e0a7a9791ab8fbd7d4e24ea7742e31" Dec 04 10:58:30 crc kubenswrapper[4776]: I1204 10:58:30.623845 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-99q5c_must-gather-zf8bk_d6f5f7d0-9c13-4c1c-b13b-718f74ae1320/gather/0.log" Dec 04 10:58:39 crc kubenswrapper[4776]: I1204 10:58:39.615829 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-99q5c/must-gather-zf8bk"] Dec 04 10:58:39 crc kubenswrapper[4776]: I1204 10:58:39.616650 4776 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-99q5c/must-gather-zf8bk" podUID="d6f5f7d0-9c13-4c1c-b13b-718f74ae1320" containerName="copy" containerID="cri-o://4373f7be7df1dd0b1c1e3d6c90c675f720c801ac7c4e240265517fe16db63164" gracePeriod=2 Dec 04 10:58:39 crc kubenswrapper[4776]: I1204 10:58:39.625025 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-99q5c/must-gather-zf8bk"] Dec 04 10:58:39 crc kubenswrapper[4776]: I1204 10:58:39.802554 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-99q5c_must-gather-zf8bk_d6f5f7d0-9c13-4c1c-b13b-718f74ae1320/copy/0.log" Dec 04 10:58:39 crc kubenswrapper[4776]: I1204 10:58:39.803295 4776 generic.go:334] "Generic (PLEG): container finished" podID="d6f5f7d0-9c13-4c1c-b13b-718f74ae1320" containerID="4373f7be7df1dd0b1c1e3d6c90c675f720c801ac7c4e240265517fe16db63164" exitCode=143 Dec 04 10:58:40 crc kubenswrapper[4776]: I1204 10:58:40.154178 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-99q5c_must-gather-zf8bk_d6f5f7d0-9c13-4c1c-b13b-718f74ae1320/copy/0.log" Dec 04 10:58:40 crc kubenswrapper[4776]: I1204 10:58:40.154840 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99q5c/must-gather-zf8bk" Dec 04 10:58:40 crc kubenswrapper[4776]: I1204 10:58:40.289183 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6f5f7d0-9c13-4c1c-b13b-718f74ae1320-must-gather-output\") pod \"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320\" (UID: \"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320\") " Dec 04 10:58:40 crc kubenswrapper[4776]: I1204 10:58:40.289312 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pdvk\" (UniqueName: \"kubernetes.io/projected/d6f5f7d0-9c13-4c1c-b13b-718f74ae1320-kube-api-access-2pdvk\") pod \"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320\" (UID: \"d6f5f7d0-9c13-4c1c-b13b-718f74ae1320\") " Dec 04 10:58:40 crc kubenswrapper[4776]: I1204 10:58:40.295484 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f5f7d0-9c13-4c1c-b13b-718f74ae1320-kube-api-access-2pdvk" (OuterVolumeSpecName: "kube-api-access-2pdvk") pod "d6f5f7d0-9c13-4c1c-b13b-718f74ae1320" (UID: "d6f5f7d0-9c13-4c1c-b13b-718f74ae1320"). InnerVolumeSpecName "kube-api-access-2pdvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:58:40 crc kubenswrapper[4776]: I1204 10:58:40.393229 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pdvk\" (UniqueName: \"kubernetes.io/projected/d6f5f7d0-9c13-4c1c-b13b-718f74ae1320-kube-api-access-2pdvk\") on node \"crc\" DevicePath \"\"" Dec 04 10:58:40 crc kubenswrapper[4776]: I1204 10:58:40.452384 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6f5f7d0-9c13-4c1c-b13b-718f74ae1320-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d6f5f7d0-9c13-4c1c-b13b-718f74ae1320" (UID: "d6f5f7d0-9c13-4c1c-b13b-718f74ae1320"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:58:40 crc kubenswrapper[4776]: I1204 10:58:40.495935 4776 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d6f5f7d0-9c13-4c1c-b13b-718f74ae1320-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 04 10:58:40 crc kubenswrapper[4776]: I1204 10:58:40.816113 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-99q5c_must-gather-zf8bk_d6f5f7d0-9c13-4c1c-b13b-718f74ae1320/copy/0.log" Dec 04 10:58:40 crc kubenswrapper[4776]: I1204 10:58:40.816689 4776 scope.go:117] "RemoveContainer" containerID="4373f7be7df1dd0b1c1e3d6c90c675f720c801ac7c4e240265517fe16db63164" Dec 04 10:58:40 crc kubenswrapper[4776]: I1204 10:58:40.816724 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99q5c/must-gather-zf8bk" Dec 04 10:58:40 crc kubenswrapper[4776]: I1204 10:58:40.843290 4776 scope.go:117] "RemoveContainer" containerID="f6efb918dc55608406785850c34a8b7304e0a7a9791ab8fbd7d4e24ea7742e31" Dec 04 10:58:41 crc kubenswrapper[4776]: I1204 10:58:41.468555 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f5f7d0-9c13-4c1c-b13b-718f74ae1320" path="/var/lib/kubelet/pods/d6f5f7d0-9c13-4c1c-b13b-718f74ae1320/volumes" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.554464 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k4r5x"] Dec 04 10:58:46 crc kubenswrapper[4776]: E1204 10:58:46.555457 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9ecd83-6424-40f6-80d5-b3c1f8df2374" containerName="extract-utilities" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.555474 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9ecd83-6424-40f6-80d5-b3c1f8df2374" containerName="extract-utilities" Dec 04 10:58:46 crc kubenswrapper[4776]: E1204 10:58:46.555516 4776 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9ecd83-6424-40f6-80d5-b3c1f8df2374" containerName="extract-content" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.555525 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9ecd83-6424-40f6-80d5-b3c1f8df2374" containerName="extract-content" Dec 04 10:58:46 crc kubenswrapper[4776]: E1204 10:58:46.555543 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f5f7d0-9c13-4c1c-b13b-718f74ae1320" containerName="gather" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.555551 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f5f7d0-9c13-4c1c-b13b-718f74ae1320" containerName="gather" Dec 04 10:58:46 crc kubenswrapper[4776]: E1204 10:58:46.555571 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f5f7d0-9c13-4c1c-b13b-718f74ae1320" containerName="copy" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.555578 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f5f7d0-9c13-4c1c-b13b-718f74ae1320" containerName="copy" Dec 04 10:58:46 crc kubenswrapper[4776]: E1204 10:58:46.555590 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9ecd83-6424-40f6-80d5-b3c1f8df2374" containerName="registry-server" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.555598 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9ecd83-6424-40f6-80d5-b3c1f8df2374" containerName="registry-server" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.555831 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f5f7d0-9c13-4c1c-b13b-718f74ae1320" containerName="copy" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.555847 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f5f7d0-9c13-4c1c-b13b-718f74ae1320" containerName="gather" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.555880 4776 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1f9ecd83-6424-40f6-80d5-b3c1f8df2374" containerName="registry-server" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.559170 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.570309 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4r5x"] Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.736754 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be1ce14-770d-43dd-b948-f25ecddbf6c3-utilities\") pod \"certified-operators-k4r5x\" (UID: \"8be1ce14-770d-43dd-b948-f25ecddbf6c3\") " pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.736878 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b2xs\" (UniqueName: \"kubernetes.io/projected/8be1ce14-770d-43dd-b948-f25ecddbf6c3-kube-api-access-6b2xs\") pod \"certified-operators-k4r5x\" (UID: \"8be1ce14-770d-43dd-b948-f25ecddbf6c3\") " pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.736904 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be1ce14-770d-43dd-b948-f25ecddbf6c3-catalog-content\") pod \"certified-operators-k4r5x\" (UID: \"8be1ce14-770d-43dd-b948-f25ecddbf6c3\") " pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.838340 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be1ce14-770d-43dd-b948-f25ecddbf6c3-utilities\") pod \"certified-operators-k4r5x\" (UID: 
\"8be1ce14-770d-43dd-b948-f25ecddbf6c3\") " pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.838449 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b2xs\" (UniqueName: \"kubernetes.io/projected/8be1ce14-770d-43dd-b948-f25ecddbf6c3-kube-api-access-6b2xs\") pod \"certified-operators-k4r5x\" (UID: \"8be1ce14-770d-43dd-b948-f25ecddbf6c3\") " pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.838471 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be1ce14-770d-43dd-b948-f25ecddbf6c3-catalog-content\") pod \"certified-operators-k4r5x\" (UID: \"8be1ce14-770d-43dd-b948-f25ecddbf6c3\") " pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.839141 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be1ce14-770d-43dd-b948-f25ecddbf6c3-utilities\") pod \"certified-operators-k4r5x\" (UID: \"8be1ce14-770d-43dd-b948-f25ecddbf6c3\") " pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.839425 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be1ce14-770d-43dd-b948-f25ecddbf6c3-catalog-content\") pod \"certified-operators-k4r5x\" (UID: \"8be1ce14-770d-43dd-b948-f25ecddbf6c3\") " pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.865362 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b2xs\" (UniqueName: \"kubernetes.io/projected/8be1ce14-770d-43dd-b948-f25ecddbf6c3-kube-api-access-6b2xs\") pod \"certified-operators-k4r5x\" (UID: 
\"8be1ce14-770d-43dd-b948-f25ecddbf6c3\") " pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:46 crc kubenswrapper[4776]: I1204 10:58:46.881618 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:47 crc kubenswrapper[4776]: I1204 10:58:47.530469 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4r5x"] Dec 04 10:58:47 crc kubenswrapper[4776]: I1204 10:58:47.966925 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4r5x" event={"ID":"8be1ce14-770d-43dd-b948-f25ecddbf6c3","Type":"ContainerStarted","Data":"99614334341a1ec9473595dd8b5da2c116d0b20dc40248502f3d262891d9d3e3"} Dec 04 10:58:48 crc kubenswrapper[4776]: I1204 10:58:48.977803 4776 generic.go:334] "Generic (PLEG): container finished" podID="8be1ce14-770d-43dd-b948-f25ecddbf6c3" containerID="9339ce5ab4e8e1abe7704fd5066b35115ce31df6d0362cb98c206629d48e58e8" exitCode=0 Dec 04 10:58:48 crc kubenswrapper[4776]: I1204 10:58:48.977888 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4r5x" event={"ID":"8be1ce14-770d-43dd-b948-f25ecddbf6c3","Type":"ContainerDied","Data":"9339ce5ab4e8e1abe7704fd5066b35115ce31df6d0362cb98c206629d48e58e8"} Dec 04 10:58:49 crc kubenswrapper[4776]: I1204 10:58:49.379752 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:58:49 crc kubenswrapper[4776]: I1204 10:58:49.379822 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:58:49 crc kubenswrapper[4776]: I1204 10:58:49.991977 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4r5x" event={"ID":"8be1ce14-770d-43dd-b948-f25ecddbf6c3","Type":"ContainerStarted","Data":"214644db0a2d1bd8c497a1a28d34b4936ee8728ccceedd4ef13e04c9325d1899"} Dec 04 10:58:52 crc kubenswrapper[4776]: I1204 10:58:52.015801 4776 generic.go:334] "Generic (PLEG): container finished" podID="8be1ce14-770d-43dd-b948-f25ecddbf6c3" containerID="214644db0a2d1bd8c497a1a28d34b4936ee8728ccceedd4ef13e04c9325d1899" exitCode=0 Dec 04 10:58:52 crc kubenswrapper[4776]: I1204 10:58:52.016367 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4r5x" event={"ID":"8be1ce14-770d-43dd-b948-f25ecddbf6c3","Type":"ContainerDied","Data":"214644db0a2d1bd8c497a1a28d34b4936ee8728ccceedd4ef13e04c9325d1899"} Dec 04 10:58:53 crc kubenswrapper[4776]: I1204 10:58:53.028807 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4r5x" event={"ID":"8be1ce14-770d-43dd-b948-f25ecddbf6c3","Type":"ContainerStarted","Data":"2017ff56f8a55d18610fbe91015210d772c8dc99560b2fb195c452dc11c8a7e7"} Dec 04 10:58:53 crc kubenswrapper[4776]: I1204 10:58:53.051704 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k4r5x" podStartSLOduration=3.632269619 podStartE2EDuration="7.05168056s" podCreationTimestamp="2025-12-04 10:58:46 +0000 UTC" firstStartedPulling="2025-12-04 10:58:48.998155856 +0000 UTC m=+4773.864636233" lastFinishedPulling="2025-12-04 10:58:52.417566797 +0000 UTC m=+4777.284047174" observedRunningTime="2025-12-04 10:58:53.046446686 +0000 UTC m=+4777.912927063" watchObservedRunningTime="2025-12-04 10:58:53.05168056 +0000 UTC m=+4777.918160937" Dec 04 10:58:56 crc 
kubenswrapper[4776]: I1204 10:58:56.882126 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:56 crc kubenswrapper[4776]: I1204 10:58:56.882675 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:56 crc kubenswrapper[4776]: I1204 10:58:56.929768 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:57 crc kubenswrapper[4776]: I1204 10:58:57.108458 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:57 crc kubenswrapper[4776]: I1204 10:58:57.168108 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4r5x"] Dec 04 10:58:59 crc kubenswrapper[4776]: I1204 10:58:59.086690 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k4r5x" podUID="8be1ce14-770d-43dd-b948-f25ecddbf6c3" containerName="registry-server" containerID="cri-o://2017ff56f8a55d18610fbe91015210d772c8dc99560b2fb195c452dc11c8a7e7" gracePeriod=2 Dec 04 10:58:59 crc kubenswrapper[4776]: I1204 10:58:59.527109 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:58:59 crc kubenswrapper[4776]: I1204 10:58:59.611210 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be1ce14-770d-43dd-b948-f25ecddbf6c3-catalog-content\") pod \"8be1ce14-770d-43dd-b948-f25ecddbf6c3\" (UID: \"8be1ce14-770d-43dd-b948-f25ecddbf6c3\") " Dec 04 10:58:59 crc kubenswrapper[4776]: I1204 10:58:59.611344 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be1ce14-770d-43dd-b948-f25ecddbf6c3-utilities\") pod \"8be1ce14-770d-43dd-b948-f25ecddbf6c3\" (UID: \"8be1ce14-770d-43dd-b948-f25ecddbf6c3\") " Dec 04 10:58:59 crc kubenswrapper[4776]: I1204 10:58:59.611564 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b2xs\" (UniqueName: \"kubernetes.io/projected/8be1ce14-770d-43dd-b948-f25ecddbf6c3-kube-api-access-6b2xs\") pod \"8be1ce14-770d-43dd-b948-f25ecddbf6c3\" (UID: \"8be1ce14-770d-43dd-b948-f25ecddbf6c3\") " Dec 04 10:58:59 crc kubenswrapper[4776]: I1204 10:58:59.614026 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8be1ce14-770d-43dd-b948-f25ecddbf6c3-utilities" (OuterVolumeSpecName: "utilities") pod "8be1ce14-770d-43dd-b948-f25ecddbf6c3" (UID: "8be1ce14-770d-43dd-b948-f25ecddbf6c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:58:59 crc kubenswrapper[4776]: I1204 10:58:59.617191 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be1ce14-770d-43dd-b948-f25ecddbf6c3-kube-api-access-6b2xs" (OuterVolumeSpecName: "kube-api-access-6b2xs") pod "8be1ce14-770d-43dd-b948-f25ecddbf6c3" (UID: "8be1ce14-770d-43dd-b948-f25ecddbf6c3"). InnerVolumeSpecName "kube-api-access-6b2xs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:58:59 crc kubenswrapper[4776]: I1204 10:58:59.668387 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8be1ce14-770d-43dd-b948-f25ecddbf6c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8be1ce14-770d-43dd-b948-f25ecddbf6c3" (UID: "8be1ce14-770d-43dd-b948-f25ecddbf6c3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:58:59 crc kubenswrapper[4776]: I1204 10:58:59.716427 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be1ce14-770d-43dd-b948-f25ecddbf6c3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:58:59 crc kubenswrapper[4776]: I1204 10:58:59.716505 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be1ce14-770d-43dd-b948-f25ecddbf6c3-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:58:59 crc kubenswrapper[4776]: I1204 10:58:59.716522 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b2xs\" (UniqueName: \"kubernetes.io/projected/8be1ce14-770d-43dd-b948-f25ecddbf6c3-kube-api-access-6b2xs\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.101040 4776 generic.go:334] "Generic (PLEG): container finished" podID="8be1ce14-770d-43dd-b948-f25ecddbf6c3" containerID="2017ff56f8a55d18610fbe91015210d772c8dc99560b2fb195c452dc11c8a7e7" exitCode=0 Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.101137 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4r5x" Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.101124 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4r5x" event={"ID":"8be1ce14-770d-43dd-b948-f25ecddbf6c3","Type":"ContainerDied","Data":"2017ff56f8a55d18610fbe91015210d772c8dc99560b2fb195c452dc11c8a7e7"} Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.101771 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4r5x" event={"ID":"8be1ce14-770d-43dd-b948-f25ecddbf6c3","Type":"ContainerDied","Data":"99614334341a1ec9473595dd8b5da2c116d0b20dc40248502f3d262891d9d3e3"} Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.101797 4776 scope.go:117] "RemoveContainer" containerID="2017ff56f8a55d18610fbe91015210d772c8dc99560b2fb195c452dc11c8a7e7" Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.135742 4776 scope.go:117] "RemoveContainer" containerID="214644db0a2d1bd8c497a1a28d34b4936ee8728ccceedd4ef13e04c9325d1899" Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.147872 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4r5x"] Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.158291 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k4r5x"] Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.180303 4776 scope.go:117] "RemoveContainer" containerID="9339ce5ab4e8e1abe7704fd5066b35115ce31df6d0362cb98c206629d48e58e8" Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.212688 4776 scope.go:117] "RemoveContainer" containerID="2017ff56f8a55d18610fbe91015210d772c8dc99560b2fb195c452dc11c8a7e7" Dec 04 10:59:00 crc kubenswrapper[4776]: E1204 10:59:00.213745 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2017ff56f8a55d18610fbe91015210d772c8dc99560b2fb195c452dc11c8a7e7\": container with ID starting with 2017ff56f8a55d18610fbe91015210d772c8dc99560b2fb195c452dc11c8a7e7 not found: ID does not exist" containerID="2017ff56f8a55d18610fbe91015210d772c8dc99560b2fb195c452dc11c8a7e7" Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.213803 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2017ff56f8a55d18610fbe91015210d772c8dc99560b2fb195c452dc11c8a7e7"} err="failed to get container status \"2017ff56f8a55d18610fbe91015210d772c8dc99560b2fb195c452dc11c8a7e7\": rpc error: code = NotFound desc = could not find container \"2017ff56f8a55d18610fbe91015210d772c8dc99560b2fb195c452dc11c8a7e7\": container with ID starting with 2017ff56f8a55d18610fbe91015210d772c8dc99560b2fb195c452dc11c8a7e7 not found: ID does not exist" Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.213839 4776 scope.go:117] "RemoveContainer" containerID="214644db0a2d1bd8c497a1a28d34b4936ee8728ccceedd4ef13e04c9325d1899" Dec 04 10:59:00 crc kubenswrapper[4776]: E1204 10:59:00.214483 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"214644db0a2d1bd8c497a1a28d34b4936ee8728ccceedd4ef13e04c9325d1899\": container with ID starting with 214644db0a2d1bd8c497a1a28d34b4936ee8728ccceedd4ef13e04c9325d1899 not found: ID does not exist" containerID="214644db0a2d1bd8c497a1a28d34b4936ee8728ccceedd4ef13e04c9325d1899" Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.214543 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"214644db0a2d1bd8c497a1a28d34b4936ee8728ccceedd4ef13e04c9325d1899"} err="failed to get container status \"214644db0a2d1bd8c497a1a28d34b4936ee8728ccceedd4ef13e04c9325d1899\": rpc error: code = NotFound desc = could not find container \"214644db0a2d1bd8c497a1a28d34b4936ee8728ccceedd4ef13e04c9325d1899\": container with ID 
starting with 214644db0a2d1bd8c497a1a28d34b4936ee8728ccceedd4ef13e04c9325d1899 not found: ID does not exist" Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.214587 4776 scope.go:117] "RemoveContainer" containerID="9339ce5ab4e8e1abe7704fd5066b35115ce31df6d0362cb98c206629d48e58e8" Dec 04 10:59:00 crc kubenswrapper[4776]: E1204 10:59:00.215895 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9339ce5ab4e8e1abe7704fd5066b35115ce31df6d0362cb98c206629d48e58e8\": container with ID starting with 9339ce5ab4e8e1abe7704fd5066b35115ce31df6d0362cb98c206629d48e58e8 not found: ID does not exist" containerID="9339ce5ab4e8e1abe7704fd5066b35115ce31df6d0362cb98c206629d48e58e8" Dec 04 10:59:00 crc kubenswrapper[4776]: I1204 10:59:00.215970 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9339ce5ab4e8e1abe7704fd5066b35115ce31df6d0362cb98c206629d48e58e8"} err="failed to get container status \"9339ce5ab4e8e1abe7704fd5066b35115ce31df6d0362cb98c206629d48e58e8\": rpc error: code = NotFound desc = could not find container \"9339ce5ab4e8e1abe7704fd5066b35115ce31df6d0362cb98c206629d48e58e8\": container with ID starting with 9339ce5ab4e8e1abe7704fd5066b35115ce31df6d0362cb98c206629d48e58e8 not found: ID does not exist" Dec 04 10:59:01 crc kubenswrapper[4776]: I1204 10:59:01.466417 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8be1ce14-770d-43dd-b948-f25ecddbf6c3" path="/var/lib/kubelet/pods/8be1ce14-770d-43dd-b948-f25ecddbf6c3/volumes" Dec 04 10:59:19 crc kubenswrapper[4776]: I1204 10:59:19.379976 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:59:19 crc kubenswrapper[4776]: I1204 
10:59:19.380644 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:59:23 crc kubenswrapper[4776]: I1204 10:59:23.940665 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d84tm"] Dec 04 10:59:23 crc kubenswrapper[4776]: E1204 10:59:23.941720 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be1ce14-770d-43dd-b948-f25ecddbf6c3" containerName="registry-server" Dec 04 10:59:23 crc kubenswrapper[4776]: I1204 10:59:23.941734 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be1ce14-770d-43dd-b948-f25ecddbf6c3" containerName="registry-server" Dec 04 10:59:23 crc kubenswrapper[4776]: E1204 10:59:23.941748 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be1ce14-770d-43dd-b948-f25ecddbf6c3" containerName="extract-utilities" Dec 04 10:59:23 crc kubenswrapper[4776]: I1204 10:59:23.941754 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be1ce14-770d-43dd-b948-f25ecddbf6c3" containerName="extract-utilities" Dec 04 10:59:23 crc kubenswrapper[4776]: E1204 10:59:23.941769 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be1ce14-770d-43dd-b948-f25ecddbf6c3" containerName="extract-content" Dec 04 10:59:23 crc kubenswrapper[4776]: I1204 10:59:23.941777 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be1ce14-770d-43dd-b948-f25ecddbf6c3" containerName="extract-content" Dec 04 10:59:23 crc kubenswrapper[4776]: I1204 10:59:23.941981 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be1ce14-770d-43dd-b948-f25ecddbf6c3" containerName="registry-server" Dec 04 10:59:23 crc kubenswrapper[4776]: I1204 10:59:23.943361 4776 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:23 crc kubenswrapper[4776]: I1204 10:59:23.956134 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d84tm"] Dec 04 10:59:24 crc kubenswrapper[4776]: I1204 10:59:24.101826 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91d8e92d-cee9-4b55-a4d9-27385172f11d-utilities\") pod \"redhat-operators-d84tm\" (UID: \"91d8e92d-cee9-4b55-a4d9-27385172f11d\") " pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:24 crc kubenswrapper[4776]: I1204 10:59:24.102202 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g62z\" (UniqueName: \"kubernetes.io/projected/91d8e92d-cee9-4b55-a4d9-27385172f11d-kube-api-access-2g62z\") pod \"redhat-operators-d84tm\" (UID: \"91d8e92d-cee9-4b55-a4d9-27385172f11d\") " pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:24 crc kubenswrapper[4776]: I1204 10:59:24.102303 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91d8e92d-cee9-4b55-a4d9-27385172f11d-catalog-content\") pod \"redhat-operators-d84tm\" (UID: \"91d8e92d-cee9-4b55-a4d9-27385172f11d\") " pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:24 crc kubenswrapper[4776]: I1204 10:59:24.204144 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91d8e92d-cee9-4b55-a4d9-27385172f11d-utilities\") pod \"redhat-operators-d84tm\" (UID: \"91d8e92d-cee9-4b55-a4d9-27385172f11d\") " pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:24 crc kubenswrapper[4776]: I1204 10:59:24.204509 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2g62z\" (UniqueName: \"kubernetes.io/projected/91d8e92d-cee9-4b55-a4d9-27385172f11d-kube-api-access-2g62z\") pod \"redhat-operators-d84tm\" (UID: \"91d8e92d-cee9-4b55-a4d9-27385172f11d\") " pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:24 crc kubenswrapper[4776]: I1204 10:59:24.204668 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91d8e92d-cee9-4b55-a4d9-27385172f11d-catalog-content\") pod \"redhat-operators-d84tm\" (UID: \"91d8e92d-cee9-4b55-a4d9-27385172f11d\") " pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:24 crc kubenswrapper[4776]: I1204 10:59:24.204734 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91d8e92d-cee9-4b55-a4d9-27385172f11d-utilities\") pod \"redhat-operators-d84tm\" (UID: \"91d8e92d-cee9-4b55-a4d9-27385172f11d\") " pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:24 crc kubenswrapper[4776]: I1204 10:59:24.205060 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91d8e92d-cee9-4b55-a4d9-27385172f11d-catalog-content\") pod \"redhat-operators-d84tm\" (UID: \"91d8e92d-cee9-4b55-a4d9-27385172f11d\") " pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:24 crc kubenswrapper[4776]: I1204 10:59:24.222578 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g62z\" (UniqueName: \"kubernetes.io/projected/91d8e92d-cee9-4b55-a4d9-27385172f11d-kube-api-access-2g62z\") pod \"redhat-operators-d84tm\" (UID: \"91d8e92d-cee9-4b55-a4d9-27385172f11d\") " pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:24 crc kubenswrapper[4776]: I1204 10:59:24.269765 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:24 crc kubenswrapper[4776]: I1204 10:59:24.740692 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d84tm"] Dec 04 10:59:25 crc kubenswrapper[4776]: I1204 10:59:25.338250 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d84tm" event={"ID":"91d8e92d-cee9-4b55-a4d9-27385172f11d","Type":"ContainerStarted","Data":"8e9a96ba8de2140492515f36c41849cb0b245bdc465d63181224d55afa6b1bd8"} Dec 04 10:59:26 crc kubenswrapper[4776]: I1204 10:59:26.350244 4776 generic.go:334] "Generic (PLEG): container finished" podID="91d8e92d-cee9-4b55-a4d9-27385172f11d" containerID="a196607833b0e1f45d47bf8875042ad21d715275fd9beb919ed64b259ae0fa98" exitCode=0 Dec 04 10:59:26 crc kubenswrapper[4776]: I1204 10:59:26.350367 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d84tm" event={"ID":"91d8e92d-cee9-4b55-a4d9-27385172f11d","Type":"ContainerDied","Data":"a196607833b0e1f45d47bf8875042ad21d715275fd9beb919ed64b259ae0fa98"} Dec 04 10:59:28 crc kubenswrapper[4776]: I1204 10:59:28.370079 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d84tm" event={"ID":"91d8e92d-cee9-4b55-a4d9-27385172f11d","Type":"ContainerStarted","Data":"4ae802843706861c755565c513ec874664e8e4eee22fd1f9ab9ed09afb1d3f0a"} Dec 04 10:59:29 crc kubenswrapper[4776]: I1204 10:59:29.380359 4776 generic.go:334] "Generic (PLEG): container finished" podID="91d8e92d-cee9-4b55-a4d9-27385172f11d" containerID="4ae802843706861c755565c513ec874664e8e4eee22fd1f9ab9ed09afb1d3f0a" exitCode=0 Dec 04 10:59:29 crc kubenswrapper[4776]: I1204 10:59:29.380678 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d84tm" 
event={"ID":"91d8e92d-cee9-4b55-a4d9-27385172f11d","Type":"ContainerDied","Data":"4ae802843706861c755565c513ec874664e8e4eee22fd1f9ab9ed09afb1d3f0a"} Dec 04 10:59:30 crc kubenswrapper[4776]: I1204 10:59:30.392751 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d84tm" event={"ID":"91d8e92d-cee9-4b55-a4d9-27385172f11d","Type":"ContainerStarted","Data":"12a7d83871cc39da8f5e7cc7d4ecb9b01ad272a4bc3def587f2b00a5ac48d96b"} Dec 04 10:59:30 crc kubenswrapper[4776]: I1204 10:59:30.422828 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d84tm" podStartSLOduration=3.928268649 podStartE2EDuration="7.422808117s" podCreationTimestamp="2025-12-04 10:59:23 +0000 UTC" firstStartedPulling="2025-12-04 10:59:26.353524045 +0000 UTC m=+4811.220004422" lastFinishedPulling="2025-12-04 10:59:29.848063523 +0000 UTC m=+4814.714543890" observedRunningTime="2025-12-04 10:59:30.413363599 +0000 UTC m=+4815.279843986" watchObservedRunningTime="2025-12-04 10:59:30.422808117 +0000 UTC m=+4815.289288494" Dec 04 10:59:34 crc kubenswrapper[4776]: I1204 10:59:34.270442 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:34 crc kubenswrapper[4776]: I1204 10:59:34.272263 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:35 crc kubenswrapper[4776]: I1204 10:59:35.333004 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d84tm" podUID="91d8e92d-cee9-4b55-a4d9-27385172f11d" containerName="registry-server" probeResult="failure" output=< Dec 04 10:59:35 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 04 10:59:35 crc kubenswrapper[4776]: > Dec 04 10:59:44 crc kubenswrapper[4776]: I1204 10:59:44.316217 4776 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:44 crc kubenswrapper[4776]: I1204 10:59:44.385650 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:44 crc kubenswrapper[4776]: I1204 10:59:44.555391 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d84tm"] Dec 04 10:59:45 crc kubenswrapper[4776]: I1204 10:59:45.521231 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d84tm" podUID="91d8e92d-cee9-4b55-a4d9-27385172f11d" containerName="registry-server" containerID="cri-o://12a7d83871cc39da8f5e7cc7d4ecb9b01ad272a4bc3def587f2b00a5ac48d96b" gracePeriod=2 Dec 04 10:59:45 crc kubenswrapper[4776]: I1204 10:59:45.960711 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.085663 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91d8e92d-cee9-4b55-a4d9-27385172f11d-catalog-content\") pod \"91d8e92d-cee9-4b55-a4d9-27385172f11d\" (UID: \"91d8e92d-cee9-4b55-a4d9-27385172f11d\") " Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.094293 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91d8e92d-cee9-4b55-a4d9-27385172f11d-utilities\") pod \"91d8e92d-cee9-4b55-a4d9-27385172f11d\" (UID: \"91d8e92d-cee9-4b55-a4d9-27385172f11d\") " Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.094583 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g62z\" (UniqueName: \"kubernetes.io/projected/91d8e92d-cee9-4b55-a4d9-27385172f11d-kube-api-access-2g62z\") pod 
\"91d8e92d-cee9-4b55-a4d9-27385172f11d\" (UID: \"91d8e92d-cee9-4b55-a4d9-27385172f11d\") " Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.095106 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91d8e92d-cee9-4b55-a4d9-27385172f11d-utilities" (OuterVolumeSpecName: "utilities") pod "91d8e92d-cee9-4b55-a4d9-27385172f11d" (UID: "91d8e92d-cee9-4b55-a4d9-27385172f11d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.096981 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91d8e92d-cee9-4b55-a4d9-27385172f11d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.100403 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d8e92d-cee9-4b55-a4d9-27385172f11d-kube-api-access-2g62z" (OuterVolumeSpecName: "kube-api-access-2g62z") pod "91d8e92d-cee9-4b55-a4d9-27385172f11d" (UID: "91d8e92d-cee9-4b55-a4d9-27385172f11d"). InnerVolumeSpecName "kube-api-access-2g62z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.198499 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g62z\" (UniqueName: \"kubernetes.io/projected/91d8e92d-cee9-4b55-a4d9-27385172f11d-kube-api-access-2g62z\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.221633 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91d8e92d-cee9-4b55-a4d9-27385172f11d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91d8e92d-cee9-4b55-a4d9-27385172f11d" (UID: "91d8e92d-cee9-4b55-a4d9-27385172f11d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.301761 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91d8e92d-cee9-4b55-a4d9-27385172f11d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.531538 4776 generic.go:334] "Generic (PLEG): container finished" podID="91d8e92d-cee9-4b55-a4d9-27385172f11d" containerID="12a7d83871cc39da8f5e7cc7d4ecb9b01ad272a4bc3def587f2b00a5ac48d96b" exitCode=0 Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.531635 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d84tm" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.531636 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d84tm" event={"ID":"91d8e92d-cee9-4b55-a4d9-27385172f11d","Type":"ContainerDied","Data":"12a7d83871cc39da8f5e7cc7d4ecb9b01ad272a4bc3def587f2b00a5ac48d96b"} Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.533182 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d84tm" event={"ID":"91d8e92d-cee9-4b55-a4d9-27385172f11d","Type":"ContainerDied","Data":"8e9a96ba8de2140492515f36c41849cb0b245bdc465d63181224d55afa6b1bd8"} Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.533269 4776 scope.go:117] "RemoveContainer" containerID="12a7d83871cc39da8f5e7cc7d4ecb9b01ad272a4bc3def587f2b00a5ac48d96b" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.558168 4776 scope.go:117] "RemoveContainer" containerID="4ae802843706861c755565c513ec874664e8e4eee22fd1f9ab9ed09afb1d3f0a" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.581368 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d84tm"] Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 
10:59:46.585430 4776 scope.go:117] "RemoveContainer" containerID="a196607833b0e1f45d47bf8875042ad21d715275fd9beb919ed64b259ae0fa98" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.590677 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d84tm"] Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.628569 4776 scope.go:117] "RemoveContainer" containerID="12a7d83871cc39da8f5e7cc7d4ecb9b01ad272a4bc3def587f2b00a5ac48d96b" Dec 04 10:59:46 crc kubenswrapper[4776]: E1204 10:59:46.629266 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12a7d83871cc39da8f5e7cc7d4ecb9b01ad272a4bc3def587f2b00a5ac48d96b\": container with ID starting with 12a7d83871cc39da8f5e7cc7d4ecb9b01ad272a4bc3def587f2b00a5ac48d96b not found: ID does not exist" containerID="12a7d83871cc39da8f5e7cc7d4ecb9b01ad272a4bc3def587f2b00a5ac48d96b" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.629329 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a7d83871cc39da8f5e7cc7d4ecb9b01ad272a4bc3def587f2b00a5ac48d96b"} err="failed to get container status \"12a7d83871cc39da8f5e7cc7d4ecb9b01ad272a4bc3def587f2b00a5ac48d96b\": rpc error: code = NotFound desc = could not find container \"12a7d83871cc39da8f5e7cc7d4ecb9b01ad272a4bc3def587f2b00a5ac48d96b\": container with ID starting with 12a7d83871cc39da8f5e7cc7d4ecb9b01ad272a4bc3def587f2b00a5ac48d96b not found: ID does not exist" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.629366 4776 scope.go:117] "RemoveContainer" containerID="4ae802843706861c755565c513ec874664e8e4eee22fd1f9ab9ed09afb1d3f0a" Dec 04 10:59:46 crc kubenswrapper[4776]: E1204 10:59:46.629903 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae802843706861c755565c513ec874664e8e4eee22fd1f9ab9ed09afb1d3f0a\": container with ID 
starting with 4ae802843706861c755565c513ec874664e8e4eee22fd1f9ab9ed09afb1d3f0a not found: ID does not exist" containerID="4ae802843706861c755565c513ec874664e8e4eee22fd1f9ab9ed09afb1d3f0a" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.629963 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae802843706861c755565c513ec874664e8e4eee22fd1f9ab9ed09afb1d3f0a"} err="failed to get container status \"4ae802843706861c755565c513ec874664e8e4eee22fd1f9ab9ed09afb1d3f0a\": rpc error: code = NotFound desc = could not find container \"4ae802843706861c755565c513ec874664e8e4eee22fd1f9ab9ed09afb1d3f0a\": container with ID starting with 4ae802843706861c755565c513ec874664e8e4eee22fd1f9ab9ed09afb1d3f0a not found: ID does not exist" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.629985 4776 scope.go:117] "RemoveContainer" containerID="a196607833b0e1f45d47bf8875042ad21d715275fd9beb919ed64b259ae0fa98" Dec 04 10:59:46 crc kubenswrapper[4776]: E1204 10:59:46.630221 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a196607833b0e1f45d47bf8875042ad21d715275fd9beb919ed64b259ae0fa98\": container with ID starting with a196607833b0e1f45d47bf8875042ad21d715275fd9beb919ed64b259ae0fa98 not found: ID does not exist" containerID="a196607833b0e1f45d47bf8875042ad21d715275fd9beb919ed64b259ae0fa98" Dec 04 10:59:46 crc kubenswrapper[4776]: I1204 10:59:46.630252 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a196607833b0e1f45d47bf8875042ad21d715275fd9beb919ed64b259ae0fa98"} err="failed to get container status \"a196607833b0e1f45d47bf8875042ad21d715275fd9beb919ed64b259ae0fa98\": rpc error: code = NotFound desc = could not find container \"a196607833b0e1f45d47bf8875042ad21d715275fd9beb919ed64b259ae0fa98\": container with ID starting with a196607833b0e1f45d47bf8875042ad21d715275fd9beb919ed64b259ae0fa98 not found: 
ID does not exist" Dec 04 10:59:47 crc kubenswrapper[4776]: I1204 10:59:47.466450 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d8e92d-cee9-4b55-a4d9-27385172f11d" path="/var/lib/kubelet/pods/91d8e92d-cee9-4b55-a4d9-27385172f11d/volumes" Dec 04 10:59:49 crc kubenswrapper[4776]: I1204 10:59:49.380406 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 10:59:49 crc kubenswrapper[4776]: I1204 10:59:49.380686 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 10:59:49 crc kubenswrapper[4776]: I1204 10:59:49.380747 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 10:59:49 crc kubenswrapper[4776]: I1204 10:59:49.381690 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bfded1b1474327da4d264d6d8de5c8bf4e532e20c1ed6eee7d088941c6d231dc"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 10:59:49 crc kubenswrapper[4776]: I1204 10:59:49.381753 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" 
containerID="cri-o://bfded1b1474327da4d264d6d8de5c8bf4e532e20c1ed6eee7d088941c6d231dc" gracePeriod=600 Dec 04 10:59:50 crc kubenswrapper[4776]: I1204 10:59:50.570020 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="bfded1b1474327da4d264d6d8de5c8bf4e532e20c1ed6eee7d088941c6d231dc" exitCode=0 Dec 04 10:59:50 crc kubenswrapper[4776]: I1204 10:59:50.570101 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"bfded1b1474327da4d264d6d8de5c8bf4e532e20c1ed6eee7d088941c6d231dc"} Dec 04 10:59:50 crc kubenswrapper[4776]: I1204 10:59:50.571478 4776 scope.go:117] "RemoveContainer" containerID="b0d125046c6eebdae7c95c4c382aa242c027c4f693aa4f67a17dccc1e2310306" Dec 04 10:59:51 crc kubenswrapper[4776]: I1204 10:59:51.586637 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce"} Dec 04 10:59:55 crc kubenswrapper[4776]: I1204 10:59:55.979127 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rwtvw"] Dec 04 10:59:55 crc kubenswrapper[4776]: E1204 10:59:55.980046 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d8e92d-cee9-4b55-a4d9-27385172f11d" containerName="extract-content" Dec 04 10:59:55 crc kubenswrapper[4776]: I1204 10:59:55.980062 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d8e92d-cee9-4b55-a4d9-27385172f11d" containerName="extract-content" Dec 04 10:59:55 crc kubenswrapper[4776]: E1204 10:59:55.980070 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d8e92d-cee9-4b55-a4d9-27385172f11d" containerName="registry-server" Dec 04 10:59:55 
crc kubenswrapper[4776]: I1204 10:59:55.980076 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d8e92d-cee9-4b55-a4d9-27385172f11d" containerName="registry-server" Dec 04 10:59:55 crc kubenswrapper[4776]: E1204 10:59:55.980097 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d8e92d-cee9-4b55-a4d9-27385172f11d" containerName="extract-utilities" Dec 04 10:59:55 crc kubenswrapper[4776]: I1204 10:59:55.980104 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d8e92d-cee9-4b55-a4d9-27385172f11d" containerName="extract-utilities" Dec 04 10:59:55 crc kubenswrapper[4776]: I1204 10:59:55.980333 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d8e92d-cee9-4b55-a4d9-27385172f11d" containerName="registry-server" Dec 04 10:59:55 crc kubenswrapper[4776]: I1204 10:59:55.982072 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwtvw" Dec 04 10:59:55 crc kubenswrapper[4776]: I1204 10:59:55.994590 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwtvw"] Dec 04 10:59:56 crc kubenswrapper[4776]: I1204 10:59:56.120621 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305e463-fa94-4700-a355-555f13ce7277-catalog-content\") pod \"redhat-marketplace-rwtvw\" (UID: \"e305e463-fa94-4700-a355-555f13ce7277\") " pod="openshift-marketplace/redhat-marketplace-rwtvw" Dec 04 10:59:56 crc kubenswrapper[4776]: I1204 10:59:56.121012 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305e463-fa94-4700-a355-555f13ce7277-utilities\") pod \"redhat-marketplace-rwtvw\" (UID: \"e305e463-fa94-4700-a355-555f13ce7277\") " pod="openshift-marketplace/redhat-marketplace-rwtvw" Dec 04 10:59:56 crc 
kubenswrapper[4776]: I1204 10:59:56.121136 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvb6r\" (UniqueName: \"kubernetes.io/projected/e305e463-fa94-4700-a355-555f13ce7277-kube-api-access-dvb6r\") pod \"redhat-marketplace-rwtvw\" (UID: \"e305e463-fa94-4700-a355-555f13ce7277\") " pod="openshift-marketplace/redhat-marketplace-rwtvw" Dec 04 10:59:56 crc kubenswrapper[4776]: I1204 10:59:56.222734 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305e463-fa94-4700-a355-555f13ce7277-catalog-content\") pod \"redhat-marketplace-rwtvw\" (UID: \"e305e463-fa94-4700-a355-555f13ce7277\") " pod="openshift-marketplace/redhat-marketplace-rwtvw" Dec 04 10:59:56 crc kubenswrapper[4776]: I1204 10:59:56.222825 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305e463-fa94-4700-a355-555f13ce7277-utilities\") pod \"redhat-marketplace-rwtvw\" (UID: \"e305e463-fa94-4700-a355-555f13ce7277\") " pod="openshift-marketplace/redhat-marketplace-rwtvw" Dec 04 10:59:56 crc kubenswrapper[4776]: I1204 10:59:56.222873 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvb6r\" (UniqueName: \"kubernetes.io/projected/e305e463-fa94-4700-a355-555f13ce7277-kube-api-access-dvb6r\") pod \"redhat-marketplace-rwtvw\" (UID: \"e305e463-fa94-4700-a355-555f13ce7277\") " pod="openshift-marketplace/redhat-marketplace-rwtvw" Dec 04 10:59:56 crc kubenswrapper[4776]: I1204 10:59:56.223668 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305e463-fa94-4700-a355-555f13ce7277-catalog-content\") pod \"redhat-marketplace-rwtvw\" (UID: \"e305e463-fa94-4700-a355-555f13ce7277\") " pod="openshift-marketplace/redhat-marketplace-rwtvw" Dec 04 10:59:56 crc 
kubenswrapper[4776]: I1204 10:59:56.224089 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305e463-fa94-4700-a355-555f13ce7277-utilities\") pod \"redhat-marketplace-rwtvw\" (UID: \"e305e463-fa94-4700-a355-555f13ce7277\") " pod="openshift-marketplace/redhat-marketplace-rwtvw" Dec 04 10:59:56 crc kubenswrapper[4776]: I1204 10:59:56.242485 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvb6r\" (UniqueName: \"kubernetes.io/projected/e305e463-fa94-4700-a355-555f13ce7277-kube-api-access-dvb6r\") pod \"redhat-marketplace-rwtvw\" (UID: \"e305e463-fa94-4700-a355-555f13ce7277\") " pod="openshift-marketplace/redhat-marketplace-rwtvw" Dec 04 10:59:56 crc kubenswrapper[4776]: I1204 10:59:56.306491 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwtvw" Dec 04 10:59:56 crc kubenswrapper[4776]: I1204 10:59:56.791816 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwtvw"] Dec 04 10:59:57 crc kubenswrapper[4776]: I1204 10:59:57.637912 4776 generic.go:334] "Generic (PLEG): container finished" podID="e305e463-fa94-4700-a355-555f13ce7277" containerID="586aed5e5544d33f0b69ccf29cf9dc9d2264bfc9607a362ee7ef857d98c9cbea" exitCode=0 Dec 04 10:59:57 crc kubenswrapper[4776]: I1204 10:59:57.638026 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwtvw" event={"ID":"e305e463-fa94-4700-a355-555f13ce7277","Type":"ContainerDied","Data":"586aed5e5544d33f0b69ccf29cf9dc9d2264bfc9607a362ee7ef857d98c9cbea"} Dec 04 10:59:57 crc kubenswrapper[4776]: I1204 10:59:57.639976 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwtvw" 
event={"ID":"e305e463-fa94-4700-a355-555f13ce7277","Type":"ContainerStarted","Data":"0bb13fe1103d91a0a1c690f591ff2a5de46506ac221ec275a184a22fb070c61b"} Dec 04 10:59:59 crc kubenswrapper[4776]: I1204 10:59:59.661535 4776 generic.go:334] "Generic (PLEG): container finished" podID="e305e463-fa94-4700-a355-555f13ce7277" containerID="0b8b9331e541c7e94271fa44a2b063ed854dadc2bbef8bc78e874b96d2c0a848" exitCode=0 Dec 04 10:59:59 crc kubenswrapper[4776]: I1204 10:59:59.661646 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwtvw" event={"ID":"e305e463-fa94-4700-a355-555f13ce7277","Type":"ContainerDied","Data":"0b8b9331e541c7e94271fa44a2b063ed854dadc2bbef8bc78e874b96d2c0a848"} Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.153843 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8"] Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.155830 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8" Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.158903 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.159561 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.164730 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8"] Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.308542 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8718921b-4d14-4185-bdb0-eaadfa5a0722-config-volume\") pod \"collect-profiles-29414100-rvbj8\" (UID: \"8718921b-4d14-4185-bdb0-eaadfa5a0722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8" Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.308645 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8718921b-4d14-4185-bdb0-eaadfa5a0722-secret-volume\") pod \"collect-profiles-29414100-rvbj8\" (UID: \"8718921b-4d14-4185-bdb0-eaadfa5a0722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8" Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.308718 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5zxl\" (UniqueName: \"kubernetes.io/projected/8718921b-4d14-4185-bdb0-eaadfa5a0722-kube-api-access-p5zxl\") pod \"collect-profiles-29414100-rvbj8\" (UID: \"8718921b-4d14-4185-bdb0-eaadfa5a0722\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8"
Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.411103 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8718921b-4d14-4185-bdb0-eaadfa5a0722-config-volume\") pod \"collect-profiles-29414100-rvbj8\" (UID: \"8718921b-4d14-4185-bdb0-eaadfa5a0722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8"
Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.411182 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8718921b-4d14-4185-bdb0-eaadfa5a0722-secret-volume\") pod \"collect-profiles-29414100-rvbj8\" (UID: \"8718921b-4d14-4185-bdb0-eaadfa5a0722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8"
Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.411232 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5zxl\" (UniqueName: \"kubernetes.io/projected/8718921b-4d14-4185-bdb0-eaadfa5a0722-kube-api-access-p5zxl\") pod \"collect-profiles-29414100-rvbj8\" (UID: \"8718921b-4d14-4185-bdb0-eaadfa5a0722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8"
Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.412877 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8718921b-4d14-4185-bdb0-eaadfa5a0722-config-volume\") pod \"collect-profiles-29414100-rvbj8\" (UID: \"8718921b-4d14-4185-bdb0-eaadfa5a0722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8"
Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.419029 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8718921b-4d14-4185-bdb0-eaadfa5a0722-secret-volume\") pod \"collect-profiles-29414100-rvbj8\" (UID: \"8718921b-4d14-4185-bdb0-eaadfa5a0722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8"
Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.430817 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5zxl\" (UniqueName: \"kubernetes.io/projected/8718921b-4d14-4185-bdb0-eaadfa5a0722-kube-api-access-p5zxl\") pod \"collect-profiles-29414100-rvbj8\" (UID: \"8718921b-4d14-4185-bdb0-eaadfa5a0722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8"
Dec 04 11:00:00 crc kubenswrapper[4776]: I1204 11:00:00.531809 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8"
Dec 04 11:00:01 crc kubenswrapper[4776]: I1204 11:00:01.387182 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8"]
Dec 04 11:00:01 crc kubenswrapper[4776]: W1204 11:00:01.395035 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8718921b_4d14_4185_bdb0_eaadfa5a0722.slice/crio-4be23917c92788eb4eb492430f90f941a396857e29d3d381f43a5bb2c56e46df WatchSource:0}: Error finding container 4be23917c92788eb4eb492430f90f941a396857e29d3d381f43a5bb2c56e46df: Status 404 returned error can't find the container with id 4be23917c92788eb4eb492430f90f941a396857e29d3d381f43a5bb2c56e46df
Dec 04 11:00:01 crc kubenswrapper[4776]: I1204 11:00:01.680309 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8" event={"ID":"8718921b-4d14-4185-bdb0-eaadfa5a0722","Type":"ContainerStarted","Data":"4be23917c92788eb4eb492430f90f941a396857e29d3d381f43a5bb2c56e46df"}
Dec 04 11:00:01 crc kubenswrapper[4776]: I1204 11:00:01.682712 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwtvw" event={"ID":"e305e463-fa94-4700-a355-555f13ce7277","Type":"ContainerStarted","Data":"eb184b96147dfd1359178f3aba7d087979c5e522b59cb0810dc4e22be2b2efc9"}
Dec 04 11:00:01 crc kubenswrapper[4776]: I1204 11:00:01.719460 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rwtvw" podStartSLOduration=3.261898386 podStartE2EDuration="6.719439969s" podCreationTimestamp="2025-12-04 10:59:55 +0000 UTC" firstStartedPulling="2025-12-04 10:59:57.639683786 +0000 UTC m=+4842.506164163" lastFinishedPulling="2025-12-04 11:00:01.097225369 +0000 UTC m=+4845.963705746" observedRunningTime="2025-12-04 11:00:01.701766292 +0000 UTC m=+4846.568246689" watchObservedRunningTime="2025-12-04 11:00:01.719439969 +0000 UTC m=+4846.585920346"
Dec 04 11:00:02 crc kubenswrapper[4776]: I1204 11:00:02.700907 4776 generic.go:334] "Generic (PLEG): container finished" podID="8718921b-4d14-4185-bdb0-eaadfa5a0722" containerID="7781d923362b2df6bdd2d94617019e614a7d10df76ee095d32142924641b0534" exitCode=0
Dec 04 11:00:02 crc kubenswrapper[4776]: I1204 11:00:02.703101 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8" event={"ID":"8718921b-4d14-4185-bdb0-eaadfa5a0722","Type":"ContainerDied","Data":"7781d923362b2df6bdd2d94617019e614a7d10df76ee095d32142924641b0534"}
Dec 04 11:00:04 crc kubenswrapper[4776]: I1204 11:00:04.043753 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8"
Dec 04 11:00:04 crc kubenswrapper[4776]: I1204 11:00:04.189746 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8718921b-4d14-4185-bdb0-eaadfa5a0722-config-volume\") pod \"8718921b-4d14-4185-bdb0-eaadfa5a0722\" (UID: \"8718921b-4d14-4185-bdb0-eaadfa5a0722\") "
Dec 04 11:00:04 crc kubenswrapper[4776]: I1204 11:00:04.190252 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5zxl\" (UniqueName: \"kubernetes.io/projected/8718921b-4d14-4185-bdb0-eaadfa5a0722-kube-api-access-p5zxl\") pod \"8718921b-4d14-4185-bdb0-eaadfa5a0722\" (UID: \"8718921b-4d14-4185-bdb0-eaadfa5a0722\") "
Dec 04 11:00:04 crc kubenswrapper[4776]: I1204 11:00:04.190368 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8718921b-4d14-4185-bdb0-eaadfa5a0722-secret-volume\") pod \"8718921b-4d14-4185-bdb0-eaadfa5a0722\" (UID: \"8718921b-4d14-4185-bdb0-eaadfa5a0722\") "
Dec 04 11:00:04 crc kubenswrapper[4776]: I1204 11:00:04.190778 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8718921b-4d14-4185-bdb0-eaadfa5a0722-config-volume" (OuterVolumeSpecName: "config-volume") pod "8718921b-4d14-4185-bdb0-eaadfa5a0722" (UID: "8718921b-4d14-4185-bdb0-eaadfa5a0722"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 11:00:04 crc kubenswrapper[4776]: I1204 11:00:04.191151 4776 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8718921b-4d14-4185-bdb0-eaadfa5a0722-config-volume\") on node \"crc\" DevicePath \"\""
Dec 04 11:00:04 crc kubenswrapper[4776]: I1204 11:00:04.196098 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8718921b-4d14-4185-bdb0-eaadfa5a0722-kube-api-access-p5zxl" (OuterVolumeSpecName: "kube-api-access-p5zxl") pod "8718921b-4d14-4185-bdb0-eaadfa5a0722" (UID: "8718921b-4d14-4185-bdb0-eaadfa5a0722"). InnerVolumeSpecName "kube-api-access-p5zxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 11:00:04 crc kubenswrapper[4776]: I1204 11:00:04.196089 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8718921b-4d14-4185-bdb0-eaadfa5a0722-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8718921b-4d14-4185-bdb0-eaadfa5a0722" (UID: "8718921b-4d14-4185-bdb0-eaadfa5a0722"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 11:00:04 crc kubenswrapper[4776]: I1204 11:00:04.292605 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5zxl\" (UniqueName: \"kubernetes.io/projected/8718921b-4d14-4185-bdb0-eaadfa5a0722-kube-api-access-p5zxl\") on node \"crc\" DevicePath \"\""
Dec 04 11:00:04 crc kubenswrapper[4776]: I1204 11:00:04.292641 4776 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8718921b-4d14-4185-bdb0-eaadfa5a0722-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 04 11:00:04 crc kubenswrapper[4776]: I1204 11:00:04.721983 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8" event={"ID":"8718921b-4d14-4185-bdb0-eaadfa5a0722","Type":"ContainerDied","Data":"4be23917c92788eb4eb492430f90f941a396857e29d3d381f43a5bb2c56e46df"}
Dec 04 11:00:04 crc kubenswrapper[4776]: I1204 11:00:04.722080 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4be23917c92788eb4eb492430f90f941a396857e29d3d381f43a5bb2c56e46df"
Dec 04 11:00:04 crc kubenswrapper[4776]: I1204 11:00:04.722089 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414100-rvbj8"
Dec 04 11:00:05 crc kubenswrapper[4776]: I1204 11:00:05.117682 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4"]
Dec 04 11:00:05 crc kubenswrapper[4776]: I1204 11:00:05.127093 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414055-ccgt4"]
Dec 04 11:00:05 crc kubenswrapper[4776]: I1204 11:00:05.467740 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d0dc27f-b4da-4930-9196-3c2d4c21aee2" path="/var/lib/kubelet/pods/5d0dc27f-b4da-4930-9196-3c2d4c21aee2/volumes"
Dec 04 11:00:06 crc kubenswrapper[4776]: I1204 11:00:06.307713 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rwtvw"
Dec 04 11:00:06 crc kubenswrapper[4776]: I1204 11:00:06.309383 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rwtvw"
Dec 04 11:00:06 crc kubenswrapper[4776]: I1204 11:00:06.408386 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rwtvw"
Dec 04 11:00:06 crc kubenswrapper[4776]: I1204 11:00:06.801431 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rwtvw"
Dec 04 11:00:06 crc kubenswrapper[4776]: I1204 11:00:06.850400 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwtvw"]
Dec 04 11:00:08 crc kubenswrapper[4776]: I1204 11:00:08.757526 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rwtvw" podUID="e305e463-fa94-4700-a355-555f13ce7277" containerName="registry-server" containerID="cri-o://eb184b96147dfd1359178f3aba7d087979c5e522b59cb0810dc4e22be2b2efc9" gracePeriod=2
Dec 04 11:00:09 crc kubenswrapper[4776]: I1204 11:00:09.772507 4776 generic.go:334] "Generic (PLEG): container finished" podID="e305e463-fa94-4700-a355-555f13ce7277" containerID="eb184b96147dfd1359178f3aba7d087979c5e522b59cb0810dc4e22be2b2efc9" exitCode=0
Dec 04 11:00:09 crc kubenswrapper[4776]: I1204 11:00:09.772735 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwtvw" event={"ID":"e305e463-fa94-4700-a355-555f13ce7277","Type":"ContainerDied","Data":"eb184b96147dfd1359178f3aba7d087979c5e522b59cb0810dc4e22be2b2efc9"}
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.041841 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwtvw"
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.221748 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvb6r\" (UniqueName: \"kubernetes.io/projected/e305e463-fa94-4700-a355-555f13ce7277-kube-api-access-dvb6r\") pod \"e305e463-fa94-4700-a355-555f13ce7277\" (UID: \"e305e463-fa94-4700-a355-555f13ce7277\") "
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.221934 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305e463-fa94-4700-a355-555f13ce7277-catalog-content\") pod \"e305e463-fa94-4700-a355-555f13ce7277\" (UID: \"e305e463-fa94-4700-a355-555f13ce7277\") "
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.221994 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305e463-fa94-4700-a355-555f13ce7277-utilities\") pod \"e305e463-fa94-4700-a355-555f13ce7277\" (UID: \"e305e463-fa94-4700-a355-555f13ce7277\") "
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.223144 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e305e463-fa94-4700-a355-555f13ce7277-utilities" (OuterVolumeSpecName: "utilities") pod "e305e463-fa94-4700-a355-555f13ce7277" (UID: "e305e463-fa94-4700-a355-555f13ce7277"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.234275 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e305e463-fa94-4700-a355-555f13ce7277-kube-api-access-dvb6r" (OuterVolumeSpecName: "kube-api-access-dvb6r") pod "e305e463-fa94-4700-a355-555f13ce7277" (UID: "e305e463-fa94-4700-a355-555f13ce7277"). InnerVolumeSpecName "kube-api-access-dvb6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.242982 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e305e463-fa94-4700-a355-555f13ce7277-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e305e463-fa94-4700-a355-555f13ce7277" (UID: "e305e463-fa94-4700-a355-555f13ce7277"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.324562 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvb6r\" (UniqueName: \"kubernetes.io/projected/e305e463-fa94-4700-a355-555f13ce7277-kube-api-access-dvb6r\") on node \"crc\" DevicePath \"\""
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.324601 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305e463-fa94-4700-a355-555f13ce7277-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.324615 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305e463-fa94-4700-a355-555f13ce7277-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.790467 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwtvw" event={"ID":"e305e463-fa94-4700-a355-555f13ce7277","Type":"ContainerDied","Data":"0bb13fe1103d91a0a1c690f591ff2a5de46506ac221ec275a184a22fb070c61b"}
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.790539 4776 scope.go:117] "RemoveContainer" containerID="eb184b96147dfd1359178f3aba7d087979c5e522b59cb0810dc4e22be2b2efc9"
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.790590 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwtvw"
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.848207 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwtvw"]
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.848731 4776 scope.go:117] "RemoveContainer" containerID="0b8b9331e541c7e94271fa44a2b063ed854dadc2bbef8bc78e874b96d2c0a848"
Dec 04 11:00:10 crc kubenswrapper[4776]: I1204 11:00:10.857071 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwtvw"]
Dec 04 11:00:11 crc kubenswrapper[4776]: I1204 11:00:11.373685 4776 scope.go:117] "RemoveContainer" containerID="586aed5e5544d33f0b69ccf29cf9dc9d2264bfc9607a362ee7ef857d98c9cbea"
Dec 04 11:00:11 crc kubenswrapper[4776]: I1204 11:00:11.463359 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e305e463-fa94-4700-a355-555f13ce7277" path="/var/lib/kubelet/pods/e305e463-fa94-4700-a355-555f13ce7277/volumes"
Dec 04 11:00:24 crc kubenswrapper[4776]: I1204 11:00:24.452238 4776 scope.go:117] "RemoveContainer" containerID="fbeaf8fe063b7431f9a1de0a404485132d9488982dfe6bc7aae07aadfdc9f89c"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.164104 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29414101-gwldl"]
Dec 04 11:01:00 crc kubenswrapper[4776]: E1204 11:01:00.168198 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e305e463-fa94-4700-a355-555f13ce7277" containerName="extract-utilities"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.168424 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e305e463-fa94-4700-a355-555f13ce7277" containerName="extract-utilities"
Dec 04 11:01:00 crc kubenswrapper[4776]: E1204 11:01:00.168525 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8718921b-4d14-4185-bdb0-eaadfa5a0722" containerName="collect-profiles"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.168613 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="8718921b-4d14-4185-bdb0-eaadfa5a0722" containerName="collect-profiles"
Dec 04 11:01:00 crc kubenswrapper[4776]: E1204 11:01:00.168701 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e305e463-fa94-4700-a355-555f13ce7277" containerName="registry-server"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.168789 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e305e463-fa94-4700-a355-555f13ce7277" containerName="registry-server"
Dec 04 11:01:00 crc kubenswrapper[4776]: E1204 11:01:00.168895 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e305e463-fa94-4700-a355-555f13ce7277" containerName="extract-content"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.169001 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="e305e463-fa94-4700-a355-555f13ce7277" containerName="extract-content"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.169333 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="e305e463-fa94-4700-a355-555f13ce7277" containerName="registry-server"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.169450 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="8718921b-4d14-4185-bdb0-eaadfa5a0722" containerName="collect-profiles"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.170463 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.175857 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414101-gwldl"]
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.295334 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-combined-ca-bundle\") pod \"keystone-cron-29414101-gwldl\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") " pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.295558 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbbgk\" (UniqueName: \"kubernetes.io/projected/56b89857-1720-498c-bdcb-551e42f49053-kube-api-access-gbbgk\") pod \"keystone-cron-29414101-gwldl\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") " pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.295617 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-fernet-keys\") pod \"keystone-cron-29414101-gwldl\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") " pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.296013 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-config-data\") pod \"keystone-cron-29414101-gwldl\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") " pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.398103 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-combined-ca-bundle\") pod \"keystone-cron-29414101-gwldl\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") " pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.398179 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbbgk\" (UniqueName: \"kubernetes.io/projected/56b89857-1720-498c-bdcb-551e42f49053-kube-api-access-gbbgk\") pod \"keystone-cron-29414101-gwldl\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") " pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.398200 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-fernet-keys\") pod \"keystone-cron-29414101-gwldl\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") " pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.398292 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-config-data\") pod \"keystone-cron-29414101-gwldl\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") " pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.406791 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-combined-ca-bundle\") pod \"keystone-cron-29414101-gwldl\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") " pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.407147 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-fernet-keys\") pod \"keystone-cron-29414101-gwldl\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") " pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.407241 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-config-data\") pod \"keystone-cron-29414101-gwldl\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") " pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.416839 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbbgk\" (UniqueName: \"kubernetes.io/projected/56b89857-1720-498c-bdcb-551e42f49053-kube-api-access-gbbgk\") pod \"keystone-cron-29414101-gwldl\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") " pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:00 crc kubenswrapper[4776]: I1204 11:01:00.552032 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:01 crc kubenswrapper[4776]: I1204 11:01:01.151128 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414101-gwldl"]
Dec 04 11:01:02 crc kubenswrapper[4776]: I1204 11:01:02.155775 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414101-gwldl" event={"ID":"56b89857-1720-498c-bdcb-551e42f49053","Type":"ContainerStarted","Data":"aa17390e394e201921918ced546066da1ead77ac074e66441907b346bae706d0"}
Dec 04 11:01:02 crc kubenswrapper[4776]: I1204 11:01:02.156142 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414101-gwldl" event={"ID":"56b89857-1720-498c-bdcb-551e42f49053","Type":"ContainerStarted","Data":"5209cde8b4e233fac9257af8963e0d89ce007168964f4246e4d9bae915c387d6"}
Dec 04 11:01:02 crc kubenswrapper[4776]: I1204 11:01:02.178231 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29414101-gwldl" podStartSLOduration=2.17820744 podStartE2EDuration="2.17820744s" podCreationTimestamp="2025-12-04 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 11:01:02.173055518 +0000 UTC m=+4907.039535905" watchObservedRunningTime="2025-12-04 11:01:02.17820744 +0000 UTC m=+4907.044687807"
Dec 04 11:01:06 crc kubenswrapper[4776]: I1204 11:01:06.199840 4776 generic.go:334] "Generic (PLEG): container finished" podID="56b89857-1720-498c-bdcb-551e42f49053" containerID="aa17390e394e201921918ced546066da1ead77ac074e66441907b346bae706d0" exitCode=0
Dec 04 11:01:06 crc kubenswrapper[4776]: I1204 11:01:06.199993 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414101-gwldl" event={"ID":"56b89857-1720-498c-bdcb-551e42f49053","Type":"ContainerDied","Data":"aa17390e394e201921918ced546066da1ead77ac074e66441907b346bae706d0"}
Dec 04 11:01:07 crc kubenswrapper[4776]: I1204 11:01:07.531220 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:07 crc kubenswrapper[4776]: I1204 11:01:07.663429 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-fernet-keys\") pod \"56b89857-1720-498c-bdcb-551e42f49053\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") "
Dec 04 11:01:07 crc kubenswrapper[4776]: I1204 11:01:07.663582 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbbgk\" (UniqueName: \"kubernetes.io/projected/56b89857-1720-498c-bdcb-551e42f49053-kube-api-access-gbbgk\") pod \"56b89857-1720-498c-bdcb-551e42f49053\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") "
Dec 04 11:01:07 crc kubenswrapper[4776]: I1204 11:01:07.663733 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-combined-ca-bundle\") pod \"56b89857-1720-498c-bdcb-551e42f49053\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") "
Dec 04 11:01:07 crc kubenswrapper[4776]: I1204 11:01:07.663944 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-config-data\") pod \"56b89857-1720-498c-bdcb-551e42f49053\" (UID: \"56b89857-1720-498c-bdcb-551e42f49053\") "
Dec 04 11:01:07 crc kubenswrapper[4776]: I1204 11:01:07.669406 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "56b89857-1720-498c-bdcb-551e42f49053" (UID: "56b89857-1720-498c-bdcb-551e42f49053"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 11:01:07 crc kubenswrapper[4776]: I1204 11:01:07.699158 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56b89857-1720-498c-bdcb-551e42f49053-kube-api-access-gbbgk" (OuterVolumeSpecName: "kube-api-access-gbbgk") pod "56b89857-1720-498c-bdcb-551e42f49053" (UID: "56b89857-1720-498c-bdcb-551e42f49053"). InnerVolumeSpecName "kube-api-access-gbbgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 11:01:07 crc kubenswrapper[4776]: I1204 11:01:07.708093 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56b89857-1720-498c-bdcb-551e42f49053" (UID: "56b89857-1720-498c-bdcb-551e42f49053"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 11:01:07 crc kubenswrapper[4776]: I1204 11:01:07.728382 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-config-data" (OuterVolumeSpecName: "config-data") pod "56b89857-1720-498c-bdcb-551e42f49053" (UID: "56b89857-1720-498c-bdcb-551e42f49053"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 11:01:07 crc kubenswrapper[4776]: I1204 11:01:07.765947 4776 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 11:01:07 crc kubenswrapper[4776]: I1204 11:01:07.765987 4776 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 11:01:07 crc kubenswrapper[4776]: I1204 11:01:07.765996 4776 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56b89857-1720-498c-bdcb-551e42f49053-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 04 11:01:07 crc kubenswrapper[4776]: I1204 11:01:07.766005 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbbgk\" (UniqueName: \"kubernetes.io/projected/56b89857-1720-498c-bdcb-551e42f49053-kube-api-access-gbbgk\") on node \"crc\" DevicePath \"\""
Dec 04 11:01:08 crc kubenswrapper[4776]: I1204 11:01:08.221652 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414101-gwldl" event={"ID":"56b89857-1720-498c-bdcb-551e42f49053","Type":"ContainerDied","Data":"5209cde8b4e233fac9257af8963e0d89ce007168964f4246e4d9bae915c387d6"}
Dec 04 11:01:08 crc kubenswrapper[4776]: I1204 11:01:08.221991 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5209cde8b4e233fac9257af8963e0d89ce007168964f4246e4d9bae915c387d6"
Dec 04 11:01:08 crc kubenswrapper[4776]: I1204 11:01:08.221698 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414101-gwldl"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.444410 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5ftlf/must-gather-8c7d2"]
Dec 04 11:01:38 crc kubenswrapper[4776]: E1204 11:01:38.445490 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56b89857-1720-498c-bdcb-551e42f49053" containerName="keystone-cron"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.445508 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b89857-1720-498c-bdcb-551e42f49053" containerName="keystone-cron"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.445697 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="56b89857-1720-498c-bdcb-551e42f49053" containerName="keystone-cron"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.446856 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5ftlf/must-gather-8c7d2"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.449409 4776 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5ftlf"/"default-dockercfg-hzzft"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.449580 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5ftlf"/"kube-root-ca.crt"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.450043 4776 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5ftlf"/"openshift-service-ca.crt"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.468169 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5ftlf/must-gather-8c7d2"]
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.540527 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1-must-gather-output\") pod \"must-gather-8c7d2\" (UID: \"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1\") " pod="openshift-must-gather-5ftlf/must-gather-8c7d2"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.540687 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq4qz\" (UniqueName: \"kubernetes.io/projected/c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1-kube-api-access-nq4qz\") pod \"must-gather-8c7d2\" (UID: \"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1\") " pod="openshift-must-gather-5ftlf/must-gather-8c7d2"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.642700 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq4qz\" (UniqueName: \"kubernetes.io/projected/c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1-kube-api-access-nq4qz\") pod \"must-gather-8c7d2\" (UID: \"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1\") " pod="openshift-must-gather-5ftlf/must-gather-8c7d2"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.642869 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1-must-gather-output\") pod \"must-gather-8c7d2\" (UID: \"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1\") " pod="openshift-must-gather-5ftlf/must-gather-8c7d2"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.643400 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1-must-gather-output\") pod \"must-gather-8c7d2\" (UID: \"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1\") " pod="openshift-must-gather-5ftlf/must-gather-8c7d2"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.661378 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq4qz\" (UniqueName: \"kubernetes.io/projected/c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1-kube-api-access-nq4qz\") pod \"must-gather-8c7d2\" (UID: \"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1\") " pod="openshift-must-gather-5ftlf/must-gather-8c7d2"
Dec 04 11:01:38 crc kubenswrapper[4776]: I1204 11:01:38.767849 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5ftlf/must-gather-8c7d2"
Dec 04 11:01:39 crc kubenswrapper[4776]: I1204 11:01:39.265276 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5ftlf/must-gather-8c7d2"]
Dec 04 11:01:39 crc kubenswrapper[4776]: I1204 11:01:39.501779 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ftlf/must-gather-8c7d2" event={"ID":"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1","Type":"ContainerStarted","Data":"379c7384f0303ba88c048f89670398d7541ea442ec6a4299c07c572d3062a2be"}
Dec 04 11:01:40 crc kubenswrapper[4776]: I1204 11:01:40.513339 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ftlf/must-gather-8c7d2" event={"ID":"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1","Type":"ContainerStarted","Data":"c3467ac4a39a2a4b41c6c2c5742f9bbd22e40c9321af81486bed8e970acc42f6"}
Dec 04 11:01:40 crc kubenswrapper[4776]: I1204 11:01:40.528552 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ftlf/must-gather-8c7d2" event={"ID":"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1","Type":"ContainerStarted","Data":"1293f77d7528ae583a6847b7bc14c0a3d181bbb7bf7e6fc5a75b046b58cf7f2b"}
Dec 04 11:01:40 crc kubenswrapper[4776]: I1204 11:01:40.547668 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5ftlf/must-gather-8c7d2" podStartSLOduration=2.5476447540000002 podStartE2EDuration="2.547644754s" podCreationTimestamp="2025-12-04 11:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 11:01:40.543409031 +0000 UTC m=+4945.409889428" watchObservedRunningTime="2025-12-04 11:01:40.547644754 +0000 UTC m=+4945.414125131"
Dec 04 11:01:43 crc kubenswrapper[4776]: I1204 11:01:43.840478 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5ftlf/crc-debug-75vkk"]
Dec 04 11:01:43 crc kubenswrapper[4776]: I1204 11:01:43.842872 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5ftlf/crc-debug-75vkk"
Dec 04 11:01:43 crc kubenswrapper[4776]: I1204 11:01:43.955230 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35cd63df-3f2c-4c9d-a6d0-db170f77e2f9-host\") pod \"crc-debug-75vkk\" (UID: \"35cd63df-3f2c-4c9d-a6d0-db170f77e2f9\") " pod="openshift-must-gather-5ftlf/crc-debug-75vkk"
Dec 04 11:01:43 crc kubenswrapper[4776]: I1204 11:01:43.955328 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh6q9\" (UniqueName: \"kubernetes.io/projected/35cd63df-3f2c-4c9d-a6d0-db170f77e2f9-kube-api-access-hh6q9\") pod \"crc-debug-75vkk\" (UID: \"35cd63df-3f2c-4c9d-a6d0-db170f77e2f9\") " pod="openshift-must-gather-5ftlf/crc-debug-75vkk"
Dec 04 11:01:44 crc kubenswrapper[4776]: I1204 11:01:44.057531 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh6q9\" (UniqueName: \"kubernetes.io/projected/35cd63df-3f2c-4c9d-a6d0-db170f77e2f9-kube-api-access-hh6q9\") pod \"crc-debug-75vkk\" (UID: \"35cd63df-3f2c-4c9d-a6d0-db170f77e2f9\") " pod="openshift-must-gather-5ftlf/crc-debug-75vkk"
Dec 04 11:01:44 crc kubenswrapper[4776]: I1204 11:01:44.057791 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35cd63df-3f2c-4c9d-a6d0-db170f77e2f9-host\") pod \"crc-debug-75vkk\" (UID: \"35cd63df-3f2c-4c9d-a6d0-db170f77e2f9\") " pod="openshift-must-gather-5ftlf/crc-debug-75vkk"
Dec 04 11:01:44 crc kubenswrapper[4776]: I1204 11:01:44.058064 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35cd63df-3f2c-4c9d-a6d0-db170f77e2f9-host\") pod \"crc-debug-75vkk\" (UID: \"35cd63df-3f2c-4c9d-a6d0-db170f77e2f9\") " pod="openshift-must-gather-5ftlf/crc-debug-75vkk"
Dec 04 11:01:44 crc kubenswrapper[4776]: I1204 11:01:44.096764 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh6q9\" (UniqueName: \"kubernetes.io/projected/35cd63df-3f2c-4c9d-a6d0-db170f77e2f9-kube-api-access-hh6q9\") pod \"crc-debug-75vkk\" (UID: \"35cd63df-3f2c-4c9d-a6d0-db170f77e2f9\") " pod="openshift-must-gather-5ftlf/crc-debug-75vkk"
Dec 04 11:01:44 crc kubenswrapper[4776]: I1204 11:01:44.180956 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5ftlf/crc-debug-75vkk"
Dec 04 11:01:44 crc kubenswrapper[4776]: W1204 11:01:44.222315 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35cd63df_3f2c_4c9d_a6d0_db170f77e2f9.slice/crio-c9657c89bb327e5fff740d7051bbb3bbb61fbd806cd3e67038e85f318655d791 WatchSource:0}: Error finding container c9657c89bb327e5fff740d7051bbb3bbb61fbd806cd3e67038e85f318655d791: Status 404 returned error can't find the container with id c9657c89bb327e5fff740d7051bbb3bbb61fbd806cd3e67038e85f318655d791
Dec 04 11:01:44 crc kubenswrapper[4776]: I1204 11:01:44.550226 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ftlf/crc-debug-75vkk" event={"ID":"35cd63df-3f2c-4c9d-a6d0-db170f77e2f9","Type":"ContainerStarted","Data":"c9657c89bb327e5fff740d7051bbb3bbb61fbd806cd3e67038e85f318655d791"}
Dec 04 11:01:45 crc kubenswrapper[4776]: I1204 11:01:45.561407 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-must-gather-5ftlf/crc-debug-75vkk" event={"ID":"35cd63df-3f2c-4c9d-a6d0-db170f77e2f9","Type":"ContainerStarted","Data":"f51881e211a77a06c850d0bfb92f4de2f9e4e458a1a5cd5eff850acb6542f8bb"} Dec 04 11:01:45 crc kubenswrapper[4776]: I1204 11:01:45.585602 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5ftlf/crc-debug-75vkk" podStartSLOduration=2.5855761790000003 podStartE2EDuration="2.585576179s" podCreationTimestamp="2025-12-04 11:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 11:01:45.577447083 +0000 UTC m=+4950.443927460" watchObservedRunningTime="2025-12-04 11:01:45.585576179 +0000 UTC m=+4950.452056556" Dec 04 11:02:19 crc kubenswrapper[4776]: I1204 11:02:19.380052 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:02:19 crc kubenswrapper[4776]: I1204 11:02:19.380574 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:02:25 crc kubenswrapper[4776]: I1204 11:02:25.925232 4776 generic.go:334] "Generic (PLEG): container finished" podID="35cd63df-3f2c-4c9d-a6d0-db170f77e2f9" containerID="f51881e211a77a06c850d0bfb92f4de2f9e4e458a1a5cd5eff850acb6542f8bb" exitCode=0 Dec 04 11:02:25 crc kubenswrapper[4776]: I1204 11:02:25.925305 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ftlf/crc-debug-75vkk" 
event={"ID":"35cd63df-3f2c-4c9d-a6d0-db170f77e2f9","Type":"ContainerDied","Data":"f51881e211a77a06c850d0bfb92f4de2f9e4e458a1a5cd5eff850acb6542f8bb"} Dec 04 11:02:27 crc kubenswrapper[4776]: I1204 11:02:27.168659 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5ftlf/crc-debug-75vkk" Dec 04 11:02:27 crc kubenswrapper[4776]: I1204 11:02:27.217067 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5ftlf/crc-debug-75vkk"] Dec 04 11:02:27 crc kubenswrapper[4776]: I1204 11:02:27.232858 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5ftlf/crc-debug-75vkk"] Dec 04 11:02:27 crc kubenswrapper[4776]: I1204 11:02:27.307039 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh6q9\" (UniqueName: \"kubernetes.io/projected/35cd63df-3f2c-4c9d-a6d0-db170f77e2f9-kube-api-access-hh6q9\") pod \"35cd63df-3f2c-4c9d-a6d0-db170f77e2f9\" (UID: \"35cd63df-3f2c-4c9d-a6d0-db170f77e2f9\") " Dec 04 11:02:27 crc kubenswrapper[4776]: I1204 11:02:27.307152 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35cd63df-3f2c-4c9d-a6d0-db170f77e2f9-host\") pod \"35cd63df-3f2c-4c9d-a6d0-db170f77e2f9\" (UID: \"35cd63df-3f2c-4c9d-a6d0-db170f77e2f9\") " Dec 04 11:02:27 crc kubenswrapper[4776]: I1204 11:02:27.307275 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35cd63df-3f2c-4c9d-a6d0-db170f77e2f9-host" (OuterVolumeSpecName: "host") pod "35cd63df-3f2c-4c9d-a6d0-db170f77e2f9" (UID: "35cd63df-3f2c-4c9d-a6d0-db170f77e2f9"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 11:02:27 crc kubenswrapper[4776]: I1204 11:02:27.307883 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35cd63df-3f2c-4c9d-a6d0-db170f77e2f9-host\") on node \"crc\" DevicePath \"\"" Dec 04 11:02:27 crc kubenswrapper[4776]: I1204 11:02:27.315746 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35cd63df-3f2c-4c9d-a6d0-db170f77e2f9-kube-api-access-hh6q9" (OuterVolumeSpecName: "kube-api-access-hh6q9") pod "35cd63df-3f2c-4c9d-a6d0-db170f77e2f9" (UID: "35cd63df-3f2c-4c9d-a6d0-db170f77e2f9"). InnerVolumeSpecName "kube-api-access-hh6q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:02:27 crc kubenswrapper[4776]: I1204 11:02:27.409644 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh6q9\" (UniqueName: \"kubernetes.io/projected/35cd63df-3f2c-4c9d-a6d0-db170f77e2f9-kube-api-access-hh6q9\") on node \"crc\" DevicePath \"\"" Dec 04 11:02:27 crc kubenswrapper[4776]: I1204 11:02:27.464800 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35cd63df-3f2c-4c9d-a6d0-db170f77e2f9" path="/var/lib/kubelet/pods/35cd63df-3f2c-4c9d-a6d0-db170f77e2f9/volumes" Dec 04 11:02:27 crc kubenswrapper[4776]: I1204 11:02:27.944256 4776 scope.go:117] "RemoveContainer" containerID="f51881e211a77a06c850d0bfb92f4de2f9e4e458a1a5cd5eff850acb6542f8bb" Dec 04 11:02:27 crc kubenswrapper[4776]: I1204 11:02:27.944317 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5ftlf/crc-debug-75vkk" Dec 04 11:02:28 crc kubenswrapper[4776]: I1204 11:02:28.419389 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5ftlf/crc-debug-cgv4h"] Dec 04 11:02:28 crc kubenswrapper[4776]: E1204 11:02:28.420587 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35cd63df-3f2c-4c9d-a6d0-db170f77e2f9" containerName="container-00" Dec 04 11:02:28 crc kubenswrapper[4776]: I1204 11:02:28.420611 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="35cd63df-3f2c-4c9d-a6d0-db170f77e2f9" containerName="container-00" Dec 04 11:02:28 crc kubenswrapper[4776]: I1204 11:02:28.420882 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="35cd63df-3f2c-4c9d-a6d0-db170f77e2f9" containerName="container-00" Dec 04 11:02:28 crc kubenswrapper[4776]: I1204 11:02:28.421972 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" Dec 04 11:02:28 crc kubenswrapper[4776]: I1204 11:02:28.459673 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q98b9\" (UniqueName: \"kubernetes.io/projected/3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2-kube-api-access-q98b9\") pod \"crc-debug-cgv4h\" (UID: \"3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2\") " pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" Dec 04 11:02:28 crc kubenswrapper[4776]: I1204 11:02:28.459741 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2-host\") pod \"crc-debug-cgv4h\" (UID: \"3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2\") " pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" Dec 04 11:02:28 crc kubenswrapper[4776]: I1204 11:02:28.561227 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q98b9\" (UniqueName: 
\"kubernetes.io/projected/3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2-kube-api-access-q98b9\") pod \"crc-debug-cgv4h\" (UID: \"3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2\") " pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" Dec 04 11:02:28 crc kubenswrapper[4776]: I1204 11:02:28.562070 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2-host\") pod \"crc-debug-cgv4h\" (UID: \"3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2\") " pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" Dec 04 11:02:28 crc kubenswrapper[4776]: I1204 11:02:28.562210 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2-host\") pod \"crc-debug-cgv4h\" (UID: \"3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2\") " pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" Dec 04 11:02:28 crc kubenswrapper[4776]: I1204 11:02:28.579991 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q98b9\" (UniqueName: \"kubernetes.io/projected/3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2-kube-api-access-q98b9\") pod \"crc-debug-cgv4h\" (UID: \"3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2\") " pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" Dec 04 11:02:28 crc kubenswrapper[4776]: I1204 11:02:28.738938 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" Dec 04 11:02:28 crc kubenswrapper[4776]: I1204 11:02:28.957529 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" event={"ID":"3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2","Type":"ContainerStarted","Data":"91bd5122be47795c317037a676648c42731a86529ecfcc3f5dc056af9ce49e36"} Dec 04 11:02:29 crc kubenswrapper[4776]: I1204 11:02:29.978707 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" event={"ID":"3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2","Type":"ContainerStarted","Data":"6ae3c50fce9db7ae5fb920ffd0ed65a57ceb70464354f708f47d9b3c43485439"} Dec 04 11:02:30 crc kubenswrapper[4776]: I1204 11:02:30.000524 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" podStartSLOduration=2.00048522 podStartE2EDuration="2.00048522s" podCreationTimestamp="2025-12-04 11:02:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 11:02:29.993561631 +0000 UTC m=+4994.860042008" watchObservedRunningTime="2025-12-04 11:02:30.00048522 +0000 UTC m=+4994.866965597" Dec 04 11:02:30 crc kubenswrapper[4776]: I1204 11:02:30.988550 4776 generic.go:334] "Generic (PLEG): container finished" podID="3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2" containerID="6ae3c50fce9db7ae5fb920ffd0ed65a57ceb70464354f708f47d9b3c43485439" exitCode=0 Dec 04 11:02:30 crc kubenswrapper[4776]: I1204 11:02:30.988597 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" event={"ID":"3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2","Type":"ContainerDied","Data":"6ae3c50fce9db7ae5fb920ffd0ed65a57ceb70464354f708f47d9b3c43485439"} Dec 04 11:02:32 crc kubenswrapper[4776]: I1204 11:02:32.123647 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" Dec 04 11:02:32 crc kubenswrapper[4776]: I1204 11:02:32.254639 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2-host\") pod \"3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2\" (UID: \"3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2\") " Dec 04 11:02:32 crc kubenswrapper[4776]: I1204 11:02:32.255292 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q98b9\" (UniqueName: \"kubernetes.io/projected/3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2-kube-api-access-q98b9\") pod \"3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2\" (UID: \"3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2\") " Dec 04 11:02:32 crc kubenswrapper[4776]: I1204 11:02:32.254718 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2-host" (OuterVolumeSpecName: "host") pod "3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2" (UID: "3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 11:02:32 crc kubenswrapper[4776]: I1204 11:02:32.256237 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2-host\") on node \"crc\" DevicePath \"\"" Dec 04 11:02:32 crc kubenswrapper[4776]: I1204 11:02:32.261575 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2-kube-api-access-q98b9" (OuterVolumeSpecName: "kube-api-access-q98b9") pod "3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2" (UID: "3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2"). InnerVolumeSpecName "kube-api-access-q98b9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:02:32 crc kubenswrapper[4776]: I1204 11:02:32.345216 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5ftlf/crc-debug-cgv4h"] Dec 04 11:02:32 crc kubenswrapper[4776]: I1204 11:02:32.357603 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q98b9\" (UniqueName: \"kubernetes.io/projected/3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2-kube-api-access-q98b9\") on node \"crc\" DevicePath \"\"" Dec 04 11:02:32 crc kubenswrapper[4776]: I1204 11:02:32.359159 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5ftlf/crc-debug-cgv4h"] Dec 04 11:02:33 crc kubenswrapper[4776]: I1204 11:02:33.017364 4776 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91bd5122be47795c317037a676648c42731a86529ecfcc3f5dc056af9ce49e36" Dec 04 11:02:33 crc kubenswrapper[4776]: I1204 11:02:33.017477 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5ftlf/crc-debug-cgv4h" Dec 04 11:02:33 crc kubenswrapper[4776]: I1204 11:02:33.465587 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2" path="/var/lib/kubelet/pods/3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2/volumes" Dec 04 11:02:33 crc kubenswrapper[4776]: I1204 11:02:33.535969 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5ftlf/crc-debug-zmhr6"] Dec 04 11:02:33 crc kubenswrapper[4776]: E1204 11:02:33.536505 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2" containerName="container-00" Dec 04 11:02:33 crc kubenswrapper[4776]: I1204 11:02:33.536528 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2" containerName="container-00" Dec 04 11:02:33 crc kubenswrapper[4776]: I1204 11:02:33.536768 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ba38f3b-57ed-4521-a1e7-0b84bc8a5cf2" containerName="container-00" Dec 04 11:02:33 crc kubenswrapper[4776]: I1204 11:02:33.539869 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" Dec 04 11:02:33 crc kubenswrapper[4776]: I1204 11:02:33.688622 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/823b4afc-8ed1-4262-9889-11a5551370a8-host\") pod \"crc-debug-zmhr6\" (UID: \"823b4afc-8ed1-4262-9889-11a5551370a8\") " pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" Dec 04 11:02:33 crc kubenswrapper[4776]: I1204 11:02:33.688760 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7bmn\" (UniqueName: \"kubernetes.io/projected/823b4afc-8ed1-4262-9889-11a5551370a8-kube-api-access-f7bmn\") pod \"crc-debug-zmhr6\" (UID: \"823b4afc-8ed1-4262-9889-11a5551370a8\") " pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" Dec 04 11:02:33 crc kubenswrapper[4776]: I1204 11:02:33.791738 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/823b4afc-8ed1-4262-9889-11a5551370a8-host\") pod \"crc-debug-zmhr6\" (UID: \"823b4afc-8ed1-4262-9889-11a5551370a8\") " pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" Dec 04 11:02:33 crc kubenswrapper[4776]: I1204 11:02:33.791866 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/823b4afc-8ed1-4262-9889-11a5551370a8-host\") pod \"crc-debug-zmhr6\" (UID: \"823b4afc-8ed1-4262-9889-11a5551370a8\") " pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" Dec 04 11:02:33 crc kubenswrapper[4776]: I1204 11:02:33.791900 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7bmn\" (UniqueName: \"kubernetes.io/projected/823b4afc-8ed1-4262-9889-11a5551370a8-kube-api-access-f7bmn\") pod \"crc-debug-zmhr6\" (UID: \"823b4afc-8ed1-4262-9889-11a5551370a8\") " pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" Dec 04 11:02:33 crc 
kubenswrapper[4776]: I1204 11:02:33.813065 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7bmn\" (UniqueName: \"kubernetes.io/projected/823b4afc-8ed1-4262-9889-11a5551370a8-kube-api-access-f7bmn\") pod \"crc-debug-zmhr6\" (UID: \"823b4afc-8ed1-4262-9889-11a5551370a8\") " pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" Dec 04 11:02:33 crc kubenswrapper[4776]: I1204 11:02:33.857429 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" Dec 04 11:02:33 crc kubenswrapper[4776]: W1204 11:02:33.904343 4776 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod823b4afc_8ed1_4262_9889_11a5551370a8.slice/crio-35b8977deb2db09fe888741ee1e63f822dfaa0075edb788b4c4f2c8c31cfc236 WatchSource:0}: Error finding container 35b8977deb2db09fe888741ee1e63f822dfaa0075edb788b4c4f2c8c31cfc236: Status 404 returned error can't find the container with id 35b8977deb2db09fe888741ee1e63f822dfaa0075edb788b4c4f2c8c31cfc236 Dec 04 11:02:34 crc kubenswrapper[4776]: I1204 11:02:34.027122 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" event={"ID":"823b4afc-8ed1-4262-9889-11a5551370a8","Type":"ContainerStarted","Data":"35b8977deb2db09fe888741ee1e63f822dfaa0075edb788b4c4f2c8c31cfc236"} Dec 04 11:02:35 crc kubenswrapper[4776]: I1204 11:02:35.037778 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" event={"ID":"823b4afc-8ed1-4262-9889-11a5551370a8","Type":"ContainerStarted","Data":"c23f4d5394817b1a7aa6895cd206bae12deed1b3b0e97042c8f429b7eaeacc26"} Dec 04 11:02:35 crc kubenswrapper[4776]: I1204 11:02:35.051077 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" podStartSLOduration=2.051059932 podStartE2EDuration="2.051059932s" 
podCreationTimestamp="2025-12-04 11:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 11:02:35.049632307 +0000 UTC m=+4999.916112684" watchObservedRunningTime="2025-12-04 11:02:35.051059932 +0000 UTC m=+4999.917540309" Dec 04 11:02:36 crc kubenswrapper[4776]: I1204 11:02:36.048411 4776 generic.go:334] "Generic (PLEG): container finished" podID="823b4afc-8ed1-4262-9889-11a5551370a8" containerID="c23f4d5394817b1a7aa6895cd206bae12deed1b3b0e97042c8f429b7eaeacc26" exitCode=0 Dec 04 11:02:36 crc kubenswrapper[4776]: I1204 11:02:36.048540 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" event={"ID":"823b4afc-8ed1-4262-9889-11a5551370a8","Type":"ContainerDied","Data":"c23f4d5394817b1a7aa6895cd206bae12deed1b3b0e97042c8f429b7eaeacc26"} Dec 04 11:02:37 crc kubenswrapper[4776]: I1204 11:02:37.176301 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" Dec 04 11:02:37 crc kubenswrapper[4776]: I1204 11:02:37.208060 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5ftlf/crc-debug-zmhr6"] Dec 04 11:02:37 crc kubenswrapper[4776]: I1204 11:02:37.219902 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5ftlf/crc-debug-zmhr6"] Dec 04 11:02:37 crc kubenswrapper[4776]: I1204 11:02:37.263502 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/823b4afc-8ed1-4262-9889-11a5551370a8-host\") pod \"823b4afc-8ed1-4262-9889-11a5551370a8\" (UID: \"823b4afc-8ed1-4262-9889-11a5551370a8\") " Dec 04 11:02:37 crc kubenswrapper[4776]: I1204 11:02:37.263973 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7bmn\" (UniqueName: \"kubernetes.io/projected/823b4afc-8ed1-4262-9889-11a5551370a8-kube-api-access-f7bmn\") pod \"823b4afc-8ed1-4262-9889-11a5551370a8\" (UID: \"823b4afc-8ed1-4262-9889-11a5551370a8\") " Dec 04 11:02:37 crc kubenswrapper[4776]: I1204 11:02:37.263765 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/823b4afc-8ed1-4262-9889-11a5551370a8-host" (OuterVolumeSpecName: "host") pod "823b4afc-8ed1-4262-9889-11a5551370a8" (UID: "823b4afc-8ed1-4262-9889-11a5551370a8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 11:02:37 crc kubenswrapper[4776]: I1204 11:02:37.271601 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/823b4afc-8ed1-4262-9889-11a5551370a8-kube-api-access-f7bmn" (OuterVolumeSpecName: "kube-api-access-f7bmn") pod "823b4afc-8ed1-4262-9889-11a5551370a8" (UID: "823b4afc-8ed1-4262-9889-11a5551370a8"). InnerVolumeSpecName "kube-api-access-f7bmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:02:37 crc kubenswrapper[4776]: I1204 11:02:37.366012 4776 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/823b4afc-8ed1-4262-9889-11a5551370a8-host\") on node \"crc\" DevicePath \"\"" Dec 04 11:02:37 crc kubenswrapper[4776]: I1204 11:02:37.366048 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7bmn\" (UniqueName: \"kubernetes.io/projected/823b4afc-8ed1-4262-9889-11a5551370a8-kube-api-access-f7bmn\") on node \"crc\" DevicePath \"\"" Dec 04 11:02:37 crc kubenswrapper[4776]: I1204 11:02:37.466020 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="823b4afc-8ed1-4262-9889-11a5551370a8" path="/var/lib/kubelet/pods/823b4afc-8ed1-4262-9889-11a5551370a8/volumes" Dec 04 11:02:37 crc kubenswrapper[4776]: E1204 11:02:37.679819 4776 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod823b4afc_8ed1_4262_9889_11a5551370a8.slice\": RecentStats: unable to find data in memory cache]" Dec 04 11:02:38 crc kubenswrapper[4776]: I1204 11:02:38.066597 4776 scope.go:117] "RemoveContainer" containerID="c23f4d5394817b1a7aa6895cd206bae12deed1b3b0e97042c8f429b7eaeacc26" Dec 04 11:02:38 crc kubenswrapper[4776]: I1204 11:02:38.066632 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5ftlf/crc-debug-zmhr6" Dec 04 11:02:49 crc kubenswrapper[4776]: I1204 11:02:49.379637 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:02:49 crc kubenswrapper[4776]: I1204 11:02:49.380242 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:03:11 crc kubenswrapper[4776]: I1204 11:03:11.033061 4776 trace.go:236] Trace[1023765035]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-jdzzf" (04-Dec-2025 11:03:08.106) (total time: 2926ms): Dec 04 11:03:11 crc kubenswrapper[4776]: Trace[1023765035]: [2.926482514s] [2.926482514s] END Dec 04 11:03:19 crc kubenswrapper[4776]: I1204 11:03:19.379710 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 11:03:19 crc kubenswrapper[4776]: I1204 11:03:19.381695 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 11:03:19 crc kubenswrapper[4776]: I1204 11:03:19.381882 4776 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" Dec 04 11:03:19 crc kubenswrapper[4776]: I1204 11:03:19.382979 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 11:03:19 crc kubenswrapper[4776]: I1204 11:03:19.383135 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" gracePeriod=600 Dec 04 11:03:20 crc kubenswrapper[4776]: E1204 11:03:20.607656 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:03:20 crc kubenswrapper[4776]: I1204 11:03:20.840850 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" exitCode=0 Dec 04 11:03:20 crc kubenswrapper[4776]: I1204 11:03:20.840899 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" 
event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce"} Dec 04 11:03:20 crc kubenswrapper[4776]: I1204 11:03:20.841100 4776 scope.go:117] "RemoveContainer" containerID="bfded1b1474327da4d264d6d8de5c8bf4e532e20c1ed6eee7d088941c6d231dc" Dec 04 11:03:20 crc kubenswrapper[4776]: I1204 11:03:20.841846 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:03:20 crc kubenswrapper[4776]: E1204 11:03:20.842252 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:03:32 crc kubenswrapper[4776]: I1204 11:03:32.452554 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:03:32 crc kubenswrapper[4776]: E1204 11:03:32.453706 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:03:33 crc kubenswrapper[4776]: I1204 11:03:33.253646 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-555f995688-x45jv_71750c70-c6f5-441b-8dae-2c78f53f5e0f/barbican-api/0.log" Dec 04 11:03:33 crc kubenswrapper[4776]: I1204 11:03:33.356602 4776 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_barbican-api-555f995688-x45jv_71750c70-c6f5-441b-8dae-2c78f53f5e0f/barbican-api-log/0.log" Dec 04 11:03:33 crc kubenswrapper[4776]: I1204 11:03:33.443850 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bddcbff8b-x8j6m_e476a541-1b98-470c-adf7-812cc06763e1/barbican-keystone-listener/0.log" Dec 04 11:03:33 crc kubenswrapper[4776]: I1204 11:03:33.662700 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-854556677-xxtrd_9d48466a-6e63-429a-aba8-cc93741041f4/barbican-worker/0.log" Dec 04 11:03:33 crc kubenswrapper[4776]: I1204 11:03:33.734038 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bddcbff8b-x8j6m_e476a541-1b98-470c-adf7-812cc06763e1/barbican-keystone-listener-log/0.log" Dec 04 11:03:34 crc kubenswrapper[4776]: I1204 11:03:34.237781 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-854556677-xxtrd_9d48466a-6e63-429a-aba8-cc93741041f4/barbican-worker-log/0.log" Dec 04 11:03:34 crc kubenswrapper[4776]: I1204 11:03:34.326138 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-x7k78_472ce27b-24c6-4557-9775-971817286847/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:34 crc kubenswrapper[4776]: I1204 11:03:34.463004 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_af73cf01-ace5-4bc7-a209-2f9eb86cb7d6/ceilometer-central-agent/0.log" Dec 04 11:03:34 crc kubenswrapper[4776]: I1204 11:03:34.512610 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_af73cf01-ace5-4bc7-a209-2f9eb86cb7d6/ceilometer-notification-agent/0.log" Dec 04 11:03:34 crc kubenswrapper[4776]: I1204 11:03:34.545985 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_af73cf01-ace5-4bc7-a209-2f9eb86cb7d6/proxy-httpd/0.log" Dec 04 11:03:34 crc kubenswrapper[4776]: I1204 11:03:34.632325 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_af73cf01-ace5-4bc7-a209-2f9eb86cb7d6/sg-core/0.log" Dec 04 11:03:34 crc kubenswrapper[4776]: I1204 11:03:34.710749 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-fmfnm_dd3ec814-a647-4575-abd7-fbdec22fd54f/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:34 crc kubenswrapper[4776]: I1204 11:03:34.871236 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w99q6_866bd984-5d2f-4eb5-ad8e-e05f3e2d1660/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:35 crc kubenswrapper[4776]: I1204 11:03:35.032628 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4b7e0c9e-6f33-42f0-af0a-0ec740ba7206/cinder-api/0.log" Dec 04 11:03:35 crc kubenswrapper[4776]: I1204 11:03:35.040290 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4b7e0c9e-6f33-42f0-af0a-0ec740ba7206/cinder-api-log/0.log" Dec 04 11:03:35 crc kubenswrapper[4776]: I1204 11:03:35.309928 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_55871647-5a7a-4fbf-954e-67418476628e/probe/0.log" Dec 04 11:03:35 crc kubenswrapper[4776]: I1204 11:03:35.479836 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_732251b5-2be6-4542-89b4-e20649ec27d0/cinder-scheduler/0.log" Dec 04 11:03:35 crc kubenswrapper[4776]: I1204 11:03:35.532017 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_55871647-5a7a-4fbf-954e-67418476628e/cinder-backup/0.log" Dec 04 11:03:35 crc kubenswrapper[4776]: I1204 11:03:35.576458 4776 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_732251b5-2be6-4542-89b4-e20649ec27d0/probe/0.log" Dec 04 11:03:35 crc kubenswrapper[4776]: I1204 11:03:35.764084 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_8933a5ae-42ff-44b3-bd28-38a424729b83/cinder-volume/0.log" Dec 04 11:03:35 crc kubenswrapper[4776]: I1204 11:03:35.806193 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_8933a5ae-42ff-44b3-bd28-38a424729b83/probe/0.log" Dec 04 11:03:35 crc kubenswrapper[4776]: I1204 11:03:35.950322 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7l6dk_59d725b6-d177-4b01-a89e-8fca3d2127ae/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:36 crc kubenswrapper[4776]: I1204 11:03:36.084059 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-4tp7q_fac53d89-7903-4d3b-abea-efbbe8f6a1b3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:36 crc kubenswrapper[4776]: I1204 11:03:36.173290 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-slhwr_e6596bf3-fdc9-4ccf-b81a-3e5372bef33f/init/0.log" Dec 04 11:03:36 crc kubenswrapper[4776]: I1204 11:03:36.427381 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_77d85923-9fcd-437f-b584-6e86641bccdf/glance-httpd/0.log" Dec 04 11:03:36 crc kubenswrapper[4776]: I1204 11:03:36.445843 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-slhwr_e6596bf3-fdc9-4ccf-b81a-3e5372bef33f/init/0.log" Dec 04 11:03:36 crc kubenswrapper[4776]: I1204 11:03:36.468723 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-slhwr_e6596bf3-fdc9-4ccf-b81a-3e5372bef33f/dnsmasq-dns/0.log" Dec 04 11:03:36 crc kubenswrapper[4776]: I1204 11:03:36.484607 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_77d85923-9fcd-437f-b584-6e86641bccdf/glance-log/0.log" Dec 04 11:03:36 crc kubenswrapper[4776]: I1204 11:03:36.655809 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7cdcbd14-b300-4a1d-b3c5-0cf70e20b290/glance-log/0.log" Dec 04 11:03:36 crc kubenswrapper[4776]: I1204 11:03:36.662706 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7cdcbd14-b300-4a1d-b3c5-0cf70e20b290/glance-httpd/0.log" Dec 04 11:03:36 crc kubenswrapper[4776]: I1204 11:03:36.792116 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d787f787d-lqf8p_1ec22398-eab3-46af-8843-1c71a2f5db12/horizon/0.log" Dec 04 11:03:36 crc kubenswrapper[4776]: I1204 11:03:36.907895 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zf2kw_c2066384-4861-4b8b-8a26-ccdafaa3394d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:37 crc kubenswrapper[4776]: I1204 11:03:37.129616 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-zhww5_4696f658-d3e7-4aee-9569-80a393613cb9/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:37 crc kubenswrapper[4776]: I1204 11:03:37.196900 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d787f787d-lqf8p_1ec22398-eab3-46af-8843-1c71a2f5db12/horizon-log/0.log" Dec 04 11:03:37 crc kubenswrapper[4776]: I1204 11:03:37.376891 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29414041-vtpvx_9d261d84-4a7d-4b97-bffa-be0cae0c8102/keystone-cron/0.log" Dec 04 11:03:37 crc kubenswrapper[4776]: I1204 11:03:37.790292 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29414101-gwldl_56b89857-1720-498c-bdcb-551e42f49053/keystone-cron/0.log" Dec 04 11:03:37 crc kubenswrapper[4776]: I1204 11:03:37.950226 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c2d3be34-5565-4f76-8afe-50df0f2a558f/kube-state-metrics/0.log" Dec 04 11:03:37 crc kubenswrapper[4776]: I1204 11:03:37.963842 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-86f58b95b9-j2njt_cb3d4759-6025-4713-90f2-7e7825ad18d3/keystone-api/0.log" Dec 04 11:03:38 crc kubenswrapper[4776]: I1204 11:03:38.115990 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-f7xzq_c95fc34d-f4d9-45d9-acf3-a4fb114a972e/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:38 crc kubenswrapper[4776]: I1204 11:03:38.507067 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_2c756ff2-9b1e-42d0-97ca-e173b0de24d5/probe/0.log" Dec 04 11:03:38 crc kubenswrapper[4776]: I1204 11:03:38.671225 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_aa8ba719-bab7-4330-97b3-1e1e35d20784/manila-api/0.log" Dec 04 11:03:38 crc kubenswrapper[4776]: I1204 11:03:38.748345 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_2c756ff2-9b1e-42d0-97ca-e173b0de24d5/manila-scheduler/0.log" Dec 04 11:03:38 crc kubenswrapper[4776]: I1204 11:03:38.855530 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_13270572-bb8d-45b4-aa78-156fc1b09a73/probe/0.log" Dec 04 11:03:39 crc kubenswrapper[4776]: I1204 11:03:39.067543 4776 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_manila-share-share1-0_13270572-bb8d-45b4-aa78-156fc1b09a73/manila-share/0.log" Dec 04 11:03:39 crc kubenswrapper[4776]: I1204 11:03:39.328603 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5779dfffd5-drdt5_4cbc85fe-8de3-45de-83d6-69da6e1b18d4/neutron-httpd/0.log" Dec 04 11:03:39 crc kubenswrapper[4776]: I1204 11:03:39.356393 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5779dfffd5-drdt5_4cbc85fe-8de3-45de-83d6-69da6e1b18d4/neutron-api/0.log" Dec 04 11:03:39 crc kubenswrapper[4776]: I1204 11:03:39.388492 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_aa8ba719-bab7-4330-97b3-1e1e35d20784/manila-api-log/0.log" Dec 04 11:03:39 crc kubenswrapper[4776]: I1204 11:03:39.540331 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-4h7gn_c3288d5d-8705-4058-ac67-ef3c5e0e0359/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:39 crc kubenswrapper[4776]: I1204 11:03:39.892055 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_519810d5-1e42-413c-893d-81e992b49d5b/nova-api-log/0.log" Dec 04 11:03:40 crc kubenswrapper[4776]: I1204 11:03:40.153546 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a5ae80cb-2f88-4707-aa4a-777b2d4e3b99/nova-cell0-conductor-conductor/0.log" Dec 04 11:03:40 crc kubenswrapper[4776]: I1204 11:03:40.305505 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_eb549ab7-99fa-4631-b0b5-d4a029e7de33/nova-cell1-conductor-conductor/0.log" Dec 04 11:03:40 crc kubenswrapper[4776]: I1204 11:03:40.371316 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_519810d5-1e42-413c-893d-81e992b49d5b/nova-api-api/0.log" Dec 04 11:03:40 crc kubenswrapper[4776]: I1204 11:03:40.578059 
4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ea0d417e-f205-4aa7-bc96-ba6879069b4a/nova-cell1-novncproxy-novncproxy/0.log" Dec 04 11:03:40 crc kubenswrapper[4776]: I1204 11:03:40.613166 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-5c9j9_700d0cc0-f03a-47f4-bb74-d727bda5f904/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:40 crc kubenswrapper[4776]: I1204 11:03:40.936375 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_840a5d28-ff84-411a-837a-5976118c262d/nova-metadata-log/0.log" Dec 04 11:03:41 crc kubenswrapper[4776]: I1204 11:03:41.222572 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a3abc5a5-f26f-4c50-9780-b79f683b4243/nova-scheduler-scheduler/0.log" Dec 04 11:03:41 crc kubenswrapper[4776]: I1204 11:03:41.222692 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_be0c172a-45d2-4fab-940c-f343c9e227fc/mysql-bootstrap/0.log" Dec 04 11:03:41 crc kubenswrapper[4776]: I1204 11:03:41.393554 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_be0c172a-45d2-4fab-940c-f343c9e227fc/mysql-bootstrap/0.log" Dec 04 11:03:41 crc kubenswrapper[4776]: I1204 11:03:41.471843 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_be0c172a-45d2-4fab-940c-f343c9e227fc/galera/0.log" Dec 04 11:03:41 crc kubenswrapper[4776]: I1204 11:03:41.646138 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b38109cb-9fe9-429d-b580-999d6978f536/mysql-bootstrap/0.log" Dec 04 11:03:41 crc kubenswrapper[4776]: I1204 11:03:41.808868 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b38109cb-9fe9-429d-b580-999d6978f536/galera/0.log" Dec 04 
11:03:41 crc kubenswrapper[4776]: I1204 11:03:41.872069 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b38109cb-9fe9-429d-b580-999d6978f536/mysql-bootstrap/0.log" Dec 04 11:03:42 crc kubenswrapper[4776]: I1204 11:03:42.033885 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5d93526f-f97b-4a2a-98b4-4b880a99cbd7/openstackclient/0.log" Dec 04 11:03:42 crc kubenswrapper[4776]: I1204 11:03:42.202794 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-2wd87_ec49e875-c217-4d3a-b821-a870a4ad1d24/openstack-network-exporter/0.log" Dec 04 11:03:42 crc kubenswrapper[4776]: I1204 11:03:42.369882 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9mnct_bd5ed17f-c4f4-4b17-b14c-d8717fc116f6/ovsdb-server-init/0.log" Dec 04 11:03:42 crc kubenswrapper[4776]: I1204 11:03:42.575437 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9mnct_bd5ed17f-c4f4-4b17-b14c-d8717fc116f6/ovs-vswitchd/0.log" Dec 04 11:03:42 crc kubenswrapper[4776]: I1204 11:03:42.588864 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9mnct_bd5ed17f-c4f4-4b17-b14c-d8717fc116f6/ovsdb-server/0.log" Dec 04 11:03:43 crc kubenswrapper[4776]: I1204 11:03:43.142644 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9mnct_bd5ed17f-c4f4-4b17-b14c-d8717fc116f6/ovsdb-server-init/0.log" Dec 04 11:03:43 crc kubenswrapper[4776]: I1204 11:03:43.492162 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-tchdq_1100839e-9cfb-4361-a653-321d0d431072/ovn-controller/0.log" Dec 04 11:03:43 crc kubenswrapper[4776]: I1204 11:03:43.559833 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_840a5d28-ff84-411a-837a-5976118c262d/nova-metadata-metadata/0.log" Dec 04 11:03:43 
crc kubenswrapper[4776]: I1204 11:03:43.715140 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fhnqg_a23a4eaf-9bb5-46f5-b8ed-57d77fbb4d0c/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:43 crc kubenswrapper[4776]: I1204 11:03:43.824158 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b19391ae-29bb-4eef-a99b-c8746488c6f5/ovn-northd/0.log" Dec 04 11:03:43 crc kubenswrapper[4776]: I1204 11:03:43.840332 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b19391ae-29bb-4eef-a99b-c8746488c6f5/openstack-network-exporter/0.log" Dec 04 11:03:44 crc kubenswrapper[4776]: I1204 11:03:44.031040 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c35b05be-3fec-4a42-af88-c80ad4c6833e/openstack-network-exporter/0.log" Dec 04 11:03:44 crc kubenswrapper[4776]: I1204 11:03:44.099824 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c35b05be-3fec-4a42-af88-c80ad4c6833e/ovsdbserver-nb/0.log" Dec 04 11:03:44 crc kubenswrapper[4776]: I1204 11:03:44.274263 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d1e14cd-4110-4ed1-9884-1318d980a844/openstack-network-exporter/0.log" Dec 04 11:03:44 crc kubenswrapper[4776]: I1204 11:03:44.318747 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d1e14cd-4110-4ed1-9884-1318d980a844/ovsdbserver-sb/0.log" Dec 04 11:03:44 crc kubenswrapper[4776]: I1204 11:03:44.444511 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d564c574b-x8jlb_38afdd55-240c-4460-aa5f-2dbbeb0b0f29/placement-api/0.log" Dec 04 11:03:44 crc kubenswrapper[4776]: I1204 11:03:44.541413 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f/setup-container/0.log" Dec 
04 11:03:44 crc kubenswrapper[4776]: I1204 11:03:44.550405 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d564c574b-x8jlb_38afdd55-240c-4460-aa5f-2dbbeb0b0f29/placement-log/0.log" Dec 04 11:03:45 crc kubenswrapper[4776]: I1204 11:03:45.165154 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f/setup-container/0.log" Dec 04 11:03:45 crc kubenswrapper[4776]: I1204 11:03:45.232878 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6b295c1b-fc2b-4e58-9175-992ce31b3a3c/setup-container/0.log" Dec 04 11:03:45 crc kubenswrapper[4776]: I1204 11:03:45.250237 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c02f37a-3e0a-48d2-b0ba-40668ac0cc1f/rabbitmq/0.log" Dec 04 11:03:45 crc kubenswrapper[4776]: I1204 11:03:45.379299 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6b295c1b-fc2b-4e58-9175-992ce31b3a3c/setup-container/0.log" Dec 04 11:03:45 crc kubenswrapper[4776]: I1204 11:03:45.458386 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:03:45 crc kubenswrapper[4776]: E1204 11:03:45.458736 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:03:45 crc kubenswrapper[4776]: I1204 11:03:45.467499 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6b295c1b-fc2b-4e58-9175-992ce31b3a3c/rabbitmq/0.log" Dec 04 11:03:45 crc 
kubenswrapper[4776]: I1204 11:03:45.508322 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-pm7tl_9166f367-b1aa-46ad-945d-d1653c18a914/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:45 crc kubenswrapper[4776]: I1204 11:03:45.678976 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-tgwk9_6ac53b67-6fc4-413b-b712-80cc35fd786e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:45 crc kubenswrapper[4776]: I1204 11:03:45.739880 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-z8x2x_6b269d5b-a372-4bb4-8e6c-558e97ce60cf/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:46 crc kubenswrapper[4776]: I1204 11:03:46.208791 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-bk6g6_5a1d46aa-2142-479a-9f26-2e8d24b69dca/ssh-known-hosts-edpm-deployment/0.log" Dec 04 11:03:46 crc kubenswrapper[4776]: I1204 11:03:46.295515 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5e42f4d6-4793-4568-9a55-4d346b39dbac/tempest-tests-tempest-tests-runner/0.log" Dec 04 11:03:46 crc kubenswrapper[4776]: I1204 11:03:46.447654 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0c9e28cc-5a44-4315-a018-c0678bc68347/test-operator-logs-container/0.log" Dec 04 11:03:46 crc kubenswrapper[4776]: I1204 11:03:46.560311 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-9grzp_9836203f-04e7-4179-b4fa-8e133dbe8e5a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 11:03:59 crc kubenswrapper[4776]: I1204 11:03:59.452022 4776 scope.go:117] "RemoveContainer" 
containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:03:59 crc kubenswrapper[4776]: E1204 11:03:59.452845 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:04:03 crc kubenswrapper[4776]: I1204 11:04:03.498251 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c1597214-7e53-46e4-8ba2-3732fc1ebf29/memcached/0.log" Dec 04 11:04:12 crc kubenswrapper[4776]: I1204 11:04:12.452454 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:04:12 crc kubenswrapper[4776]: E1204 11:04:12.453235 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:04:16 crc kubenswrapper[4776]: I1204 11:04:16.567813 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jtqt7_ec5df28d-5944-43f3-bf28-12e1062b1060/kube-rbac-proxy/0.log" Dec 04 11:04:16 crc kubenswrapper[4776]: I1204 11:04:16.667033 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jtqt7_ec5df28d-5944-43f3-bf28-12e1062b1060/manager/0.log" Dec 04 11:04:16 crc kubenswrapper[4776]: 
I1204 11:04:16.811467 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-8h27m_61813ce8-b03b-473b-9606-22515ab1de03/kube-rbac-proxy/0.log" Dec 04 11:04:16 crc kubenswrapper[4776]: I1204 11:04:16.879676 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-8h27m_61813ce8-b03b-473b-9606-22515ab1de03/manager/0.log" Dec 04 11:04:16 crc kubenswrapper[4776]: I1204 11:04:16.989609 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-z8q57_df5a8995-658c-4525-93ac-604d3c2af213/kube-rbac-proxy/0.log" Dec 04 11:04:17 crc kubenswrapper[4776]: I1204 11:04:17.034045 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-z8q57_df5a8995-658c-4525-93ac-604d3c2af213/manager/0.log" Dec 04 11:04:17 crc kubenswrapper[4776]: I1204 11:04:17.197547 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/util/0.log" Dec 04 11:04:17 crc kubenswrapper[4776]: I1204 11:04:17.348938 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/util/0.log" Dec 04 11:04:17 crc kubenswrapper[4776]: I1204 11:04:17.377755 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/pull/0.log" Dec 04 11:04:17 crc kubenswrapper[4776]: I1204 11:04:17.394310 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/pull/0.log" Dec 04 11:04:17 crc kubenswrapper[4776]: I1204 11:04:17.560145 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/extract/0.log" Dec 04 11:04:17 crc kubenswrapper[4776]: I1204 11:04:17.562480 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/util/0.log" Dec 04 11:04:17 crc kubenswrapper[4776]: I1204 11:04:17.571990 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f984ac608ff779a5b7f39114f2bcf8382b3ad8fa9cee41d218ea61c2b3hwcpz_14389a80-35f5-46c7-9689-acaa3fd5310d/pull/0.log" Dec 04 11:04:17 crc kubenswrapper[4776]: I1204 11:04:17.733099 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-d7zhq_2ceaf037-5fce-4ef5-b273-724eb446e0af/kube-rbac-proxy/0.log" Dec 04 11:04:17 crc kubenswrapper[4776]: I1204 11:04:17.810763 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-x9jlc_25849bc1-46e2-4ff1-a61a-f0b7105290bf/kube-rbac-proxy/0.log" Dec 04 11:04:17 crc kubenswrapper[4776]: I1204 11:04:17.865338 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-d7zhq_2ceaf037-5fce-4ef5-b273-724eb446e0af/manager/0.log" Dec 04 11:04:17 crc kubenswrapper[4776]: I1204 11:04:17.914947 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-x9jlc_25849bc1-46e2-4ff1-a61a-f0b7105290bf/manager/0.log" Dec 04 11:04:18 crc kubenswrapper[4776]: I1204 
11:04:18.039436 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-z6kf6_34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe/kube-rbac-proxy/0.log" Dec 04 11:04:18 crc kubenswrapper[4776]: I1204 11:04:18.119864 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-z6kf6_34ca0cb2-f4dc-4d1c-b8c4-2aecf1dac1fe/manager/0.log" Dec 04 11:04:18 crc kubenswrapper[4776]: I1204 11:04:18.221072 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-fk6f5_a0857db7-00e4-410c-b5a2-945a46ae175a/kube-rbac-proxy/0.log" Dec 04 11:04:18 crc kubenswrapper[4776]: I1204 11:04:18.337524 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-ldf84_58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1/kube-rbac-proxy/0.log" Dec 04 11:04:18 crc kubenswrapper[4776]: I1204 11:04:18.422844 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-fk6f5_a0857db7-00e4-410c-b5a2-945a46ae175a/manager/0.log" Dec 04 11:04:18 crc kubenswrapper[4776]: I1204 11:04:18.444025 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-ldf84_58e96f2c-dbd7-4b0f-9a8f-7fa7ff1362f1/manager/0.log" Dec 04 11:04:18 crc kubenswrapper[4776]: I1204 11:04:18.553979 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-zq7wg_6171555b-a2ba-4177-b7d7-3bb5496a99bd/kube-rbac-proxy/0.log" Dec 04 11:04:18 crc kubenswrapper[4776]: I1204 11:04:18.679233 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-zq7wg_6171555b-a2ba-4177-b7d7-3bb5496a99bd/manager/0.log" 
Dec 04 11:04:18 crc kubenswrapper[4776]: I1204 11:04:18.788143 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-79d898f8f7-lbtlb_eca2af80-0e84-4615-9bd7-a907029259e7/kube-rbac-proxy/0.log" Dec 04 11:04:18 crc kubenswrapper[4776]: I1204 11:04:18.841496 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-79d898f8f7-lbtlb_eca2af80-0e84-4615-9bd7-a907029259e7/manager/0.log" Dec 04 11:04:18 crc kubenswrapper[4776]: I1204 11:04:18.927351 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-5l4h4_fe5ac80c-367a-489b-901e-76d872a26e4b/kube-rbac-proxy/0.log" Dec 04 11:04:19 crc kubenswrapper[4776]: I1204 11:04:19.030621 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-5l4h4_fe5ac80c-367a-489b-901e-76d872a26e4b/manager/0.log" Dec 04 11:04:19 crc kubenswrapper[4776]: I1204 11:04:19.129885 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-4d8fg_23b5c3d3-b677-4440-b489-9e1811b722bb/kube-rbac-proxy/0.log" Dec 04 11:04:19 crc kubenswrapper[4776]: I1204 11:04:19.192011 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-4d8fg_23b5c3d3-b677-4440-b489-9e1811b722bb/manager/0.log" Dec 04 11:04:19 crc kubenswrapper[4776]: I1204 11:04:19.288365 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-ft7rc_17848cf1-eceb-4e3e-9e39-40a7e4507d6b/kube-rbac-proxy/0.log" Dec 04 11:04:19 crc kubenswrapper[4776]: I1204 11:04:19.458853 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-ft7rc_17848cf1-eceb-4e3e-9e39-40a7e4507d6b/manager/0.log" Dec 04 11:04:19 crc kubenswrapper[4776]: I1204 11:04:19.480852 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mlnr6_115873e4-456f-4d60-84f0-182f467cb8c0/kube-rbac-proxy/0.log" Dec 04 11:04:19 crc kubenswrapper[4776]: I1204 11:04:19.517181 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mlnr6_115873e4-456f-4d60-84f0-182f467cb8c0/manager/0.log" Dec 04 11:04:19 crc kubenswrapper[4776]: I1204 11:04:19.710407 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj_ec5e5439-8cfc-4e75-9627-45e4999aacea/kube-rbac-proxy/0.log" Dec 04 11:04:19 crc kubenswrapper[4776]: I1204 11:04:19.807850 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4hthgj_ec5e5439-8cfc-4e75-9627-45e4999aacea/manager/0.log" Dec 04 11:04:20 crc kubenswrapper[4776]: I1204 11:04:20.297343 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6db4dd56f6-5962s_730ff180-62d9-4a70-b200-e2ac3ea2b4c8/operator/0.log" Dec 04 11:04:20 crc kubenswrapper[4776]: I1204 11:04:20.342475 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qr9vs_f6f9de95-98b9-47ca-b4a0-c5a99ca9a610/registry-server/0.log" Dec 04 11:04:20 crc kubenswrapper[4776]: I1204 11:04:20.604785 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hscd7_50a0ede3-8c98-47c6-945e-6aeefa27f86e/kube-rbac-proxy/0.log" Dec 04 11:04:20 crc kubenswrapper[4776]: I1204 11:04:20.705122 
4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hscd7_50a0ede3-8c98-47c6-945e-6aeefa27f86e/manager/0.log" Dec 04 11:04:20 crc kubenswrapper[4776]: I1204 11:04:20.814742 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-pzjlr_8f26eb91-a638-4ba9-9547-7bef2c5513c4/kube-rbac-proxy/0.log" Dec 04 11:04:20 crc kubenswrapper[4776]: I1204 11:04:20.858119 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-pzjlr_8f26eb91-a638-4ba9-9547-7bef2c5513c4/manager/0.log" Dec 04 11:04:21 crc kubenswrapper[4776]: I1204 11:04:21.015827 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-s767z_615d312b-bd1f-40c3-b499-a7c4ae351cd3/operator/0.log" Dec 04 11:04:21 crc kubenswrapper[4776]: I1204 11:04:21.078363 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-8wwhc_c0269b5f-db90-427e-933b-6221bcfbde9e/kube-rbac-proxy/0.log" Dec 04 11:04:21 crc kubenswrapper[4776]: I1204 11:04:21.226216 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-8wwhc_c0269b5f-db90-427e-933b-6221bcfbde9e/manager/0.log" Dec 04 11:04:21 crc kubenswrapper[4776]: I1204 11:04:21.322627 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-mkn8n_725f674d-7785-4bb1-95d2-2a650b9f4df8/kube-rbac-proxy/0.log" Dec 04 11:04:21 crc kubenswrapper[4776]: I1204 11:04:21.501869 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7b845677-nvxnd_72061fb8-5546-4ced-ba4a-f7faeeebec85/manager/0.log" Dec 04 11:04:21 crc 
kubenswrapper[4776]: I1204 11:04:21.548559 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wbcs6_f6f8f6ca-820b-41e8-af0a-aa6b439a3dad/kube-rbac-proxy/0.log" Dec 04 11:04:21 crc kubenswrapper[4776]: I1204 11:04:21.600353 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wbcs6_f6f8f6ca-820b-41e8-af0a-aa6b439a3dad/manager/0.log" Dec 04 11:04:21 crc kubenswrapper[4776]: I1204 11:04:21.608877 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-mkn8n_725f674d-7785-4bb1-95d2-2a650b9f4df8/manager/0.log" Dec 04 11:04:21 crc kubenswrapper[4776]: I1204 11:04:21.732746 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-4c2d9_6bab5c22-f51d-4049-adb5-343a7195eeb7/kube-rbac-proxy/0.log" Dec 04 11:04:21 crc kubenswrapper[4776]: I1204 11:04:21.781540 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-4c2d9_6bab5c22-f51d-4049-adb5-343a7195eeb7/manager/0.log" Dec 04 11:04:25 crc kubenswrapper[4776]: I1204 11:04:25.461126 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:04:25 crc kubenswrapper[4776]: E1204 11:04:25.462084 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:04:36 crc kubenswrapper[4776]: I1204 11:04:36.453321 4776 
scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:04:36 crc kubenswrapper[4776]: E1204 11:04:36.454134 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:04:41 crc kubenswrapper[4776]: I1204 11:04:41.818095 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-4xnmk_696d9668-ce83-427c-8b8c-cb069a6c1b26/control-plane-machine-set-operator/0.log" Dec 04 11:04:42 crc kubenswrapper[4776]: I1204 11:04:42.018911 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lc8p8_34aa75c9-39fc-49eb-b338-d2b1a36535a8/machine-api-operator/0.log" Dec 04 11:04:42 crc kubenswrapper[4776]: I1204 11:04:42.037016 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lc8p8_34aa75c9-39fc-49eb-b338-d2b1a36535a8/kube-rbac-proxy/0.log" Dec 04 11:04:50 crc kubenswrapper[4776]: I1204 11:04:50.452128 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:04:50 crc kubenswrapper[4776]: E1204 11:04:50.452947 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:04:57 crc kubenswrapper[4776]: I1204 11:04:57.183669 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-4vx5h_d19ecdb4-7502-46be-b833-c0f7608c5ce4/cert-manager-cainjector/0.log" Dec 04 11:04:57 crc kubenswrapper[4776]: I1204 11:04:57.197566 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-lxxbd_4ac63db5-784c-4a99-a405-75c3d9f3909c/cert-manager-controller/0.log" Dec 04 11:04:57 crc kubenswrapper[4776]: I1204 11:04:57.348666 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-s9v8v_688382e7-42ed-4f38-bd1e-3a0b40fa42bf/cert-manager-webhook/0.log" Dec 04 11:05:04 crc kubenswrapper[4776]: I1204 11:05:04.453006 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:05:04 crc kubenswrapper[4776]: E1204 11:05:04.454770 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:05:10 crc kubenswrapper[4776]: I1204 11:05:10.447848 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-pndwq_86108b12-167c-4f7f-bbbf-566c1158e81c/nmstate-console-plugin/0.log" Dec 04 11:05:10 crc kubenswrapper[4776]: I1204 11:05:10.615283 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-2bzm7_99323994-d641-4eb4-b540-41bc2f5241ee/nmstate-handler/0.log" Dec 
04 11:05:10 crc kubenswrapper[4776]: I1204 11:05:10.678398 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-ffbgz_fe3aabeb-baf7-4d17-ab72-485cb4412799/kube-rbac-proxy/0.log" Dec 04 11:05:10 crc kubenswrapper[4776]: I1204 11:05:10.763174 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-ffbgz_fe3aabeb-baf7-4d17-ab72-485cb4412799/nmstate-metrics/0.log" Dec 04 11:05:10 crc kubenswrapper[4776]: I1204 11:05:10.901369 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-4rmmf_af38e5f3-7acd-482c-9561-91789c242956/nmstate-operator/0.log" Dec 04 11:05:10 crc kubenswrapper[4776]: I1204 11:05:10.980199 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-ns5br_f0f2d721-f66d-4f50-8b31-2a879a904faf/nmstate-webhook/0.log" Dec 04 11:05:15 crc kubenswrapper[4776]: I1204 11:05:15.455707 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:05:15 crc kubenswrapper[4776]: E1204 11:05:15.459639 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:05:26 crc kubenswrapper[4776]: I1204 11:05:26.453053 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:05:26 crc kubenswrapper[4776]: E1204 11:05:26.453895 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.028792 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-nk6g7_cac2d534-af69-46ca-ab51-5ba3b56999fe/controller/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.061411 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-nk6g7_cac2d534-af69-46ca-ab51-5ba3b56999fe/kube-rbac-proxy/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.187424 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-rgsnm_eab1cf4a-97de-4d47-a34d-503d31d32d77/frr-k8s-webhook-server/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.282616 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-frr-files/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.464861 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-metrics/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.468186 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-reloader/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.474909 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-reloader/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.502015 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-frr-files/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.684737 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-frr-files/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.706107 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-metrics/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.714640 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-reloader/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.771123 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-metrics/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.931180 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-frr-files/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.951197 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-reloader/0.log" Dec 04 11:05:27 crc kubenswrapper[4776]: I1204 11:05:27.952781 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/cp-metrics/0.log" Dec 04 11:05:28 crc kubenswrapper[4776]: I1204 11:05:28.014271 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/controller/0.log" Dec 04 11:05:28 crc kubenswrapper[4776]: I1204 11:05:28.785879 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/frr-metrics/0.log" Dec 04 11:05:28 crc kubenswrapper[4776]: I1204 11:05:28.851995 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/kube-rbac-proxy/0.log" Dec 04 11:05:28 crc kubenswrapper[4776]: I1204 11:05:28.915847 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/kube-rbac-proxy-frr/0.log" Dec 04 11:05:29 crc kubenswrapper[4776]: I1204 11:05:29.012271 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/reloader/0.log" Dec 04 11:05:29 crc kubenswrapper[4776]: I1204 11:05:29.261852 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-855d6cf46f-579qr_0887beaf-a370-4268-9011-8278551d91bd/manager/0.log" Dec 04 11:05:29 crc kubenswrapper[4776]: I1204 11:05:29.488685 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-769c6857f6-zvn5n_82bf45e3-e222-4569-bedd-5c160fa3f1d4/webhook-server/0.log" Dec 04 11:05:29 crc kubenswrapper[4776]: I1204 11:05:29.490097 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lp9tk_2ab18bfc-af5b-4be8-b481-7fdc03809bde/kube-rbac-proxy/0.log" Dec 04 11:05:29 crc kubenswrapper[4776]: I1204 11:05:29.703279 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xv8wt_07958ee1-d044-41bd-a405-eb3d7585f036/frr/0.log" Dec 04 11:05:30 crc kubenswrapper[4776]: I1204 11:05:30.005745 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lp9tk_2ab18bfc-af5b-4be8-b481-7fdc03809bde/speaker/0.log" Dec 04 11:05:41 crc kubenswrapper[4776]: I1204 11:05:41.453255 4776 scope.go:117] "RemoveContainer" 
containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:05:41 crc kubenswrapper[4776]: E1204 11:05:41.454077 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:05:42 crc kubenswrapper[4776]: I1204 11:05:42.778649 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/util/0.log" Dec 04 11:05:42 crc kubenswrapper[4776]: I1204 11:05:42.912989 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/pull/0.log" Dec 04 11:05:42 crc kubenswrapper[4776]: I1204 11:05:42.915605 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/util/0.log" Dec 04 11:05:42 crc kubenswrapper[4776]: I1204 11:05:42.989766 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/pull/0.log" Dec 04 11:05:43 crc kubenswrapper[4776]: I1204 11:05:43.186544 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/extract/0.log" Dec 04 11:05:43 crc kubenswrapper[4776]: I1204 11:05:43.191813 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/util/0.log" Dec 04 11:05:43 crc kubenswrapper[4776]: I1204 11:05:43.219294 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frqbf9_41f02085-31f0-4a13-a020-8f55ce5e481b/pull/0.log" Dec 04 11:05:43 crc kubenswrapper[4776]: I1204 11:05:43.378383 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/util/0.log" Dec 04 11:05:43 crc kubenswrapper[4776]: I1204 11:05:43.544009 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/pull/0.log" Dec 04 11:05:43 crc kubenswrapper[4776]: I1204 11:05:43.545037 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/util/0.log" Dec 04 11:05:43 crc kubenswrapper[4776]: I1204 11:05:43.561118 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/pull/0.log" Dec 04 11:05:43 crc kubenswrapper[4776]: I1204 11:05:43.795206 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/extract/0.log" Dec 04 11:05:43 crc kubenswrapper[4776]: I1204 11:05:43.803343 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/pull/0.log" Dec 04 
11:05:43 crc kubenswrapper[4776]: I1204 11:05:43.809810 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f839rkvt_99270406-e115-46c3-aecb-7155ec24ab04/util/0.log" Dec 04 11:05:44 crc kubenswrapper[4776]: I1204 11:05:44.001495 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/extract-utilities/0.log" Dec 04 11:05:44 crc kubenswrapper[4776]: I1204 11:05:44.191283 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/extract-utilities/0.log" Dec 04 11:05:44 crc kubenswrapper[4776]: I1204 11:05:44.234573 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/extract-content/0.log" Dec 04 11:05:44 crc kubenswrapper[4776]: I1204 11:05:44.246739 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/extract-content/0.log" Dec 04 11:05:44 crc kubenswrapper[4776]: I1204 11:05:44.429440 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/extract-utilities/0.log" Dec 04 11:05:44 crc kubenswrapper[4776]: I1204 11:05:44.436039 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/extract-content/0.log" Dec 04 11:05:44 crc kubenswrapper[4776]: I1204 11:05:44.651492 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/extract-utilities/0.log" Dec 04 11:05:44 crc kubenswrapper[4776]: I1204 11:05:44.962654 4776 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/extract-content/0.log" Dec 04 11:05:45 crc kubenswrapper[4776]: I1204 11:05:45.002009 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/extract-utilities/0.log" Dec 04 11:05:45 crc kubenswrapper[4776]: I1204 11:05:45.004325 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/extract-content/0.log" Dec 04 11:05:45 crc kubenswrapper[4776]: I1204 11:05:45.243180 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xk7pg_4d5e770b-fc01-43d8-9ebf-4d8a791330b7/registry-server/0.log" Dec 04 11:05:45 crc kubenswrapper[4776]: I1204 11:05:45.256587 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/extract-content/0.log" Dec 04 11:05:45 crc kubenswrapper[4776]: I1204 11:05:45.287863 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/extract-utilities/0.log" Dec 04 11:05:45 crc kubenswrapper[4776]: I1204 11:05:45.482550 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nvzbs_d8f6af2c-c9d1-4e7e-b102-cb6fb4c7fcf8/marketplace-operator/0.log" Dec 04 11:05:45 crc kubenswrapper[4776]: I1204 11:05:45.793552 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/extract-utilities/0.log" Dec 04 11:05:46 crc kubenswrapper[4776]: I1204 11:05:46.048477 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/extract-utilities/0.log" Dec 04 11:05:46 crc kubenswrapper[4776]: I1204 11:05:46.088328 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/extract-content/0.log" Dec 04 11:05:46 crc kubenswrapper[4776]: I1204 11:05:46.097646 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/extract-content/0.log" Dec 04 11:05:46 crc kubenswrapper[4776]: I1204 11:05:46.291489 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ncssp_e7fd8145-8236-4f79-a7fe-67009d283ef5/registry-server/0.log" Dec 04 11:05:46 crc kubenswrapper[4776]: I1204 11:05:46.352447 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/extract-content/0.log" Dec 04 11:05:46 crc kubenswrapper[4776]: I1204 11:05:46.400379 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/extract-utilities/0.log" Dec 04 11:05:46 crc kubenswrapper[4776]: I1204 11:05:46.553439 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dsqhv_573cfb4b-7da3-471b-b00a-5343818665c2/registry-server/0.log" Dec 04 11:05:46 crc kubenswrapper[4776]: I1204 11:05:46.616006 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/extract-utilities/0.log" Dec 04 11:05:46 crc kubenswrapper[4776]: I1204 11:05:46.797072 4776 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/extract-content/0.log" Dec 04 11:05:46 crc kubenswrapper[4776]: I1204 11:05:46.816639 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/extract-content/0.log" Dec 04 11:05:46 crc kubenswrapper[4776]: I1204 11:05:46.871058 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/extract-utilities/0.log" Dec 04 11:05:47 crc kubenswrapper[4776]: I1204 11:05:47.058076 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/extract-utilities/0.log" Dec 04 11:05:47 crc kubenswrapper[4776]: I1204 11:05:47.068406 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/extract-content/0.log" Dec 04 11:05:47 crc kubenswrapper[4776]: I1204 11:05:47.906888 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-df8j6_6701794f-bfce-4c33-bcd4-08a8225ca4e3/registry-server/0.log" Dec 04 11:05:56 crc kubenswrapper[4776]: I1204 11:05:56.452565 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:05:56 crc kubenswrapper[4776]: E1204 11:05:56.453419 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:06:11 crc 
kubenswrapper[4776]: I1204 11:06:11.452049 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:06:11 crc kubenswrapper[4776]: E1204 11:06:11.453003 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:06:24 crc kubenswrapper[4776]: I1204 11:06:24.457229 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:06:24 crc kubenswrapper[4776]: E1204 11:06:24.459603 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:06:35 crc kubenswrapper[4776]: I1204 11:06:35.461383 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:06:35 crc kubenswrapper[4776]: E1204 11:06:35.462245 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 
04 11:06:50 crc kubenswrapper[4776]: I1204 11:06:50.452390 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:06:50 crc kubenswrapper[4776]: E1204 11:06:50.453279 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:07:05 crc kubenswrapper[4776]: I1204 11:07:05.460487 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:07:05 crc kubenswrapper[4776]: E1204 11:07:05.461571 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:07:20 crc kubenswrapper[4776]: I1204 11:07:20.452361 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:07:20 crc kubenswrapper[4776]: E1204 11:07:20.453398 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" 
podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:07:33 crc kubenswrapper[4776]: I1204 11:07:33.452140 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:07:33 crc kubenswrapper[4776]: E1204 11:07:33.452939 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:07:47 crc kubenswrapper[4776]: I1204 11:07:47.452588 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:07:47 crc kubenswrapper[4776]: E1204 11:07:47.453569 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:07:57 crc kubenswrapper[4776]: I1204 11:07:57.463467 4776 generic.go:334] "Generic (PLEG): container finished" podID="c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1" containerID="1293f77d7528ae583a6847b7bc14c0a3d181bbb7bf7e6fc5a75b046b58cf7f2b" exitCode=0 Dec 04 11:07:57 crc kubenswrapper[4776]: I1204 11:07:57.464302 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ftlf/must-gather-8c7d2" event={"ID":"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1","Type":"ContainerDied","Data":"1293f77d7528ae583a6847b7bc14c0a3d181bbb7bf7e6fc5a75b046b58cf7f2b"} Dec 04 11:07:57 crc 
kubenswrapper[4776]: I1204 11:07:57.465041 4776 scope.go:117] "RemoveContainer" containerID="1293f77d7528ae583a6847b7bc14c0a3d181bbb7bf7e6fc5a75b046b58cf7f2b" Dec 04 11:07:58 crc kubenswrapper[4776]: I1204 11:07:58.374344 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5ftlf_must-gather-8c7d2_c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1/gather/0.log" Dec 04 11:08:02 crc kubenswrapper[4776]: I1204 11:08:02.454040 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:08:02 crc kubenswrapper[4776]: E1204 11:08:02.457523 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:08:09 crc kubenswrapper[4776]: I1204 11:08:09.490194 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5ftlf/must-gather-8c7d2"] Dec 04 11:08:09 crc kubenswrapper[4776]: I1204 11:08:09.491624 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5ftlf/must-gather-8c7d2" podUID="c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1" containerName="copy" containerID="cri-o://c3467ac4a39a2a4b41c6c2c5742f9bbd22e40c9321af81486bed8e970acc42f6" gracePeriod=2 Dec 04 11:08:09 crc kubenswrapper[4776]: I1204 11:08:09.499274 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5ftlf/must-gather-8c7d2"] Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.346551 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5ftlf_must-gather-8c7d2_c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1/copy/0.log" Dec 04 11:08:10 crc 
kubenswrapper[4776]: I1204 11:08:10.347497 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5ftlf/must-gather-8c7d2" Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.450768 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1-must-gather-output\") pod \"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1\" (UID: \"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1\") " Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.451184 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq4qz\" (UniqueName: \"kubernetes.io/projected/c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1-kube-api-access-nq4qz\") pod \"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1\" (UID: \"c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1\") " Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.458059 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1-kube-api-access-nq4qz" (OuterVolumeSpecName: "kube-api-access-nq4qz") pod "c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1" (UID: "c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1"). InnerVolumeSpecName "kube-api-access-nq4qz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.554419 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq4qz\" (UniqueName: \"kubernetes.io/projected/c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1-kube-api-access-nq4qz\") on node \"crc\" DevicePath \"\"" Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.593748 4776 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5ftlf_must-gather-8c7d2_c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1/copy/0.log" Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.594475 4776 generic.go:334] "Generic (PLEG): container finished" podID="c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1" containerID="c3467ac4a39a2a4b41c6c2c5742f9bbd22e40c9321af81486bed8e970acc42f6" exitCode=143 Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.594546 4776 scope.go:117] "RemoveContainer" containerID="c3467ac4a39a2a4b41c6c2c5742f9bbd22e40c9321af81486bed8e970acc42f6" Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.594702 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5ftlf/must-gather-8c7d2" Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.623474 4776 scope.go:117] "RemoveContainer" containerID="1293f77d7528ae583a6847b7bc14c0a3d181bbb7bf7e6fc5a75b046b58cf7f2b" Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.627092 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1" (UID: "c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.657436 4776 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.719297 4776 scope.go:117] "RemoveContainer" containerID="c3467ac4a39a2a4b41c6c2c5742f9bbd22e40c9321af81486bed8e970acc42f6" Dec 04 11:08:10 crc kubenswrapper[4776]: E1204 11:08:10.719994 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3467ac4a39a2a4b41c6c2c5742f9bbd22e40c9321af81486bed8e970acc42f6\": container with ID starting with c3467ac4a39a2a4b41c6c2c5742f9bbd22e40c9321af81486bed8e970acc42f6 not found: ID does not exist" containerID="c3467ac4a39a2a4b41c6c2c5742f9bbd22e40c9321af81486bed8e970acc42f6" Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.720030 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3467ac4a39a2a4b41c6c2c5742f9bbd22e40c9321af81486bed8e970acc42f6"} err="failed to get container status \"c3467ac4a39a2a4b41c6c2c5742f9bbd22e40c9321af81486bed8e970acc42f6\": rpc error: code = NotFound desc = could not find container \"c3467ac4a39a2a4b41c6c2c5742f9bbd22e40c9321af81486bed8e970acc42f6\": container with ID starting with c3467ac4a39a2a4b41c6c2c5742f9bbd22e40c9321af81486bed8e970acc42f6 not found: ID does not exist" Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.720052 4776 scope.go:117] "RemoveContainer" containerID="1293f77d7528ae583a6847b7bc14c0a3d181bbb7bf7e6fc5a75b046b58cf7f2b" Dec 04 11:08:10 crc kubenswrapper[4776]: E1204 11:08:10.720355 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1293f77d7528ae583a6847b7bc14c0a3d181bbb7bf7e6fc5a75b046b58cf7f2b\": container with ID starting with 1293f77d7528ae583a6847b7bc14c0a3d181bbb7bf7e6fc5a75b046b58cf7f2b not found: ID does not exist" containerID="1293f77d7528ae583a6847b7bc14c0a3d181bbb7bf7e6fc5a75b046b58cf7f2b" Dec 04 11:08:10 crc kubenswrapper[4776]: I1204 11:08:10.720382 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1293f77d7528ae583a6847b7bc14c0a3d181bbb7bf7e6fc5a75b046b58cf7f2b"} err="failed to get container status \"1293f77d7528ae583a6847b7bc14c0a3d181bbb7bf7e6fc5a75b046b58cf7f2b\": rpc error: code = NotFound desc = could not find container \"1293f77d7528ae583a6847b7bc14c0a3d181bbb7bf7e6fc5a75b046b58cf7f2b\": container with ID starting with 1293f77d7528ae583a6847b7bc14c0a3d181bbb7bf7e6fc5a75b046b58cf7f2b not found: ID does not exist" Dec 04 11:08:11 crc kubenswrapper[4776]: I1204 11:08:11.470271 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1" path="/var/lib/kubelet/pods/c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1/volumes" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.698664 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wc7sx"] Dec 04 11:08:15 crc kubenswrapper[4776]: E1204 11:08:15.699604 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823b4afc-8ed1-4262-9889-11a5551370a8" containerName="container-00" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.699619 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="823b4afc-8ed1-4262-9889-11a5551370a8" containerName="container-00" Dec 04 11:08:15 crc kubenswrapper[4776]: E1204 11:08:15.699628 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1" containerName="copy" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.699633 4776 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1" containerName="copy" Dec 04 11:08:15 crc kubenswrapper[4776]: E1204 11:08:15.699646 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1" containerName="gather" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.699652 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1" containerName="gather" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.699856 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1" containerName="copy" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.699868 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="823b4afc-8ed1-4262-9889-11a5551370a8" containerName="container-00" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.699880 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e4d186-6b0e-4d40-b1d0-ecf2b6170bd1" containerName="gather" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.701490 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.711543 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wc7sx"] Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.872053 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f38997e-1b70-4ea7-ad2b-18db4582e744-catalog-content\") pod \"community-operators-wc7sx\" (UID: \"2f38997e-1b70-4ea7-ad2b-18db4582e744\") " pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.872449 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f38997e-1b70-4ea7-ad2b-18db4582e744-utilities\") pod \"community-operators-wc7sx\" (UID: \"2f38997e-1b70-4ea7-ad2b-18db4582e744\") " pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.872739 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rlfv\" (UniqueName: \"kubernetes.io/projected/2f38997e-1b70-4ea7-ad2b-18db4582e744-kube-api-access-5rlfv\") pod \"community-operators-wc7sx\" (UID: \"2f38997e-1b70-4ea7-ad2b-18db4582e744\") " pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.974437 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f38997e-1b70-4ea7-ad2b-18db4582e744-catalog-content\") pod \"community-operators-wc7sx\" (UID: \"2f38997e-1b70-4ea7-ad2b-18db4582e744\") " pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.974594 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f38997e-1b70-4ea7-ad2b-18db4582e744-utilities\") pod \"community-operators-wc7sx\" (UID: \"2f38997e-1b70-4ea7-ad2b-18db4582e744\") " pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.974669 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rlfv\" (UniqueName: \"kubernetes.io/projected/2f38997e-1b70-4ea7-ad2b-18db4582e744-kube-api-access-5rlfv\") pod \"community-operators-wc7sx\" (UID: \"2f38997e-1b70-4ea7-ad2b-18db4582e744\") " pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.974960 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f38997e-1b70-4ea7-ad2b-18db4582e744-catalog-content\") pod \"community-operators-wc7sx\" (UID: \"2f38997e-1b70-4ea7-ad2b-18db4582e744\") " pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.975501 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f38997e-1b70-4ea7-ad2b-18db4582e744-utilities\") pod \"community-operators-wc7sx\" (UID: \"2f38997e-1b70-4ea7-ad2b-18db4582e744\") " pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:15 crc kubenswrapper[4776]: I1204 11:08:15.995010 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rlfv\" (UniqueName: \"kubernetes.io/projected/2f38997e-1b70-4ea7-ad2b-18db4582e744-kube-api-access-5rlfv\") pod \"community-operators-wc7sx\" (UID: \"2f38997e-1b70-4ea7-ad2b-18db4582e744\") " pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:16 crc kubenswrapper[4776]: I1204 11:08:16.029316 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:16 crc kubenswrapper[4776]: I1204 11:08:16.451844 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:08:16 crc kubenswrapper[4776]: E1204 11:08:16.452249 4776 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d6wbt_openshift-machine-config-operator(a57f7940-a976-4c85-bcb7-a1c24ba08266)\"" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" Dec 04 11:08:16 crc kubenswrapper[4776]: I1204 11:08:16.562240 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wc7sx"] Dec 04 11:08:16 crc kubenswrapper[4776]: I1204 11:08:16.660858 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc7sx" event={"ID":"2f38997e-1b70-4ea7-ad2b-18db4582e744","Type":"ContainerStarted","Data":"79adb690ad44077821605d39eb1cfd6d236f71f7169d749c13e8f7f9843e1679"} Dec 04 11:08:17 crc kubenswrapper[4776]: I1204 11:08:17.671431 4776 generic.go:334] "Generic (PLEG): container finished" podID="2f38997e-1b70-4ea7-ad2b-18db4582e744" containerID="a2d7c3bbc881f3050c10a4626ee99688c903a3f3890894af3f7f845666de378c" exitCode=0 Dec 04 11:08:17 crc kubenswrapper[4776]: I1204 11:08:17.671820 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc7sx" event={"ID":"2f38997e-1b70-4ea7-ad2b-18db4582e744","Type":"ContainerDied","Data":"a2d7c3bbc881f3050c10a4626ee99688c903a3f3890894af3f7f845666de378c"} Dec 04 11:08:17 crc kubenswrapper[4776]: I1204 11:08:17.674405 4776 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 
11:08:19 crc kubenswrapper[4776]: I1204 11:08:19.693683 4776 generic.go:334] "Generic (PLEG): container finished" podID="2f38997e-1b70-4ea7-ad2b-18db4582e744" containerID="5c5055801385b20f549a91de8a447784547c00267407766a7b567816884d822e" exitCode=0 Dec 04 11:08:19 crc kubenswrapper[4776]: I1204 11:08:19.693741 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc7sx" event={"ID":"2f38997e-1b70-4ea7-ad2b-18db4582e744","Type":"ContainerDied","Data":"5c5055801385b20f549a91de8a447784547c00267407766a7b567816884d822e"} Dec 04 11:08:20 crc kubenswrapper[4776]: I1204 11:08:20.707463 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc7sx" event={"ID":"2f38997e-1b70-4ea7-ad2b-18db4582e744","Type":"ContainerStarted","Data":"10b2a18754d9fafdd049db1278a28a9b4172d7f0703269cd6db8295246497df3"} Dec 04 11:08:20 crc kubenswrapper[4776]: I1204 11:08:20.736146 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wc7sx" podStartSLOduration=3.036146228 podStartE2EDuration="5.736089745s" podCreationTimestamp="2025-12-04 11:08:15 +0000 UTC" firstStartedPulling="2025-12-04 11:08:17.674080412 +0000 UTC m=+5342.540560789" lastFinishedPulling="2025-12-04 11:08:20.374023929 +0000 UTC m=+5345.240504306" observedRunningTime="2025-12-04 11:08:20.728467735 +0000 UTC m=+5345.594948132" watchObservedRunningTime="2025-12-04 11:08:20.736089745 +0000 UTC m=+5345.602570132" Dec 04 11:08:26 crc kubenswrapper[4776]: I1204 11:08:26.030199 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:26 crc kubenswrapper[4776]: I1204 11:08:26.031676 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:26 crc kubenswrapper[4776]: I1204 11:08:26.080815 4776 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:26 crc kubenswrapper[4776]: I1204 11:08:26.814173 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:26 crc kubenswrapper[4776]: I1204 11:08:26.881561 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wc7sx"] Dec 04 11:08:28 crc kubenswrapper[4776]: I1204 11:08:28.778474 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wc7sx" podUID="2f38997e-1b70-4ea7-ad2b-18db4582e744" containerName="registry-server" containerID="cri-o://10b2a18754d9fafdd049db1278a28a9b4172d7f0703269cd6db8295246497df3" gracePeriod=2 Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.329326 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.507018 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rlfv\" (UniqueName: \"kubernetes.io/projected/2f38997e-1b70-4ea7-ad2b-18db4582e744-kube-api-access-5rlfv\") pod \"2f38997e-1b70-4ea7-ad2b-18db4582e744\" (UID: \"2f38997e-1b70-4ea7-ad2b-18db4582e744\") " Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.507088 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f38997e-1b70-4ea7-ad2b-18db4582e744-utilities\") pod \"2f38997e-1b70-4ea7-ad2b-18db4582e744\" (UID: \"2f38997e-1b70-4ea7-ad2b-18db4582e744\") " Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.507352 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2f38997e-1b70-4ea7-ad2b-18db4582e744-catalog-content\") pod \"2f38997e-1b70-4ea7-ad2b-18db4582e744\" (UID: \"2f38997e-1b70-4ea7-ad2b-18db4582e744\") " Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.509057 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f38997e-1b70-4ea7-ad2b-18db4582e744-utilities" (OuterVolumeSpecName: "utilities") pod "2f38997e-1b70-4ea7-ad2b-18db4582e744" (UID: "2f38997e-1b70-4ea7-ad2b-18db4582e744"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.516965 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f38997e-1b70-4ea7-ad2b-18db4582e744-kube-api-access-5rlfv" (OuterVolumeSpecName: "kube-api-access-5rlfv") pod "2f38997e-1b70-4ea7-ad2b-18db4582e744" (UID: "2f38997e-1b70-4ea7-ad2b-18db4582e744"). InnerVolumeSpecName "kube-api-access-5rlfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.557024 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f38997e-1b70-4ea7-ad2b-18db4582e744-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f38997e-1b70-4ea7-ad2b-18db4582e744" (UID: "2f38997e-1b70-4ea7-ad2b-18db4582e744"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.610383 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f38997e-1b70-4ea7-ad2b-18db4582e744-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.610540 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rlfv\" (UniqueName: \"kubernetes.io/projected/2f38997e-1b70-4ea7-ad2b-18db4582e744-kube-api-access-5rlfv\") on node \"crc\" DevicePath \"\"" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.610579 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f38997e-1b70-4ea7-ad2b-18db4582e744-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.790189 4776 generic.go:334] "Generic (PLEG): container finished" podID="2f38997e-1b70-4ea7-ad2b-18db4582e744" containerID="10b2a18754d9fafdd049db1278a28a9b4172d7f0703269cd6db8295246497df3" exitCode=0 Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.790247 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wc7sx" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.790263 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc7sx" event={"ID":"2f38997e-1b70-4ea7-ad2b-18db4582e744","Type":"ContainerDied","Data":"10b2a18754d9fafdd049db1278a28a9b4172d7f0703269cd6db8295246497df3"} Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.791796 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc7sx" event={"ID":"2f38997e-1b70-4ea7-ad2b-18db4582e744","Type":"ContainerDied","Data":"79adb690ad44077821605d39eb1cfd6d236f71f7169d749c13e8f7f9843e1679"} Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.791823 4776 scope.go:117] "RemoveContainer" containerID="10b2a18754d9fafdd049db1278a28a9b4172d7f0703269cd6db8295246497df3" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.818897 4776 scope.go:117] "RemoveContainer" containerID="5c5055801385b20f549a91de8a447784547c00267407766a7b567816884d822e" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.826376 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wc7sx"] Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.836202 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wc7sx"] Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.853315 4776 scope.go:117] "RemoveContainer" containerID="a2d7c3bbc881f3050c10a4626ee99688c903a3f3890894af3f7f845666de378c" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.893961 4776 scope.go:117] "RemoveContainer" containerID="10b2a18754d9fafdd049db1278a28a9b4172d7f0703269cd6db8295246497df3" Dec 04 11:08:29 crc kubenswrapper[4776]: E1204 11:08:29.894660 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"10b2a18754d9fafdd049db1278a28a9b4172d7f0703269cd6db8295246497df3\": container with ID starting with 10b2a18754d9fafdd049db1278a28a9b4172d7f0703269cd6db8295246497df3 not found: ID does not exist" containerID="10b2a18754d9fafdd049db1278a28a9b4172d7f0703269cd6db8295246497df3" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.894701 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b2a18754d9fafdd049db1278a28a9b4172d7f0703269cd6db8295246497df3"} err="failed to get container status \"10b2a18754d9fafdd049db1278a28a9b4172d7f0703269cd6db8295246497df3\": rpc error: code = NotFound desc = could not find container \"10b2a18754d9fafdd049db1278a28a9b4172d7f0703269cd6db8295246497df3\": container with ID starting with 10b2a18754d9fafdd049db1278a28a9b4172d7f0703269cd6db8295246497df3 not found: ID does not exist" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.894751 4776 scope.go:117] "RemoveContainer" containerID="5c5055801385b20f549a91de8a447784547c00267407766a7b567816884d822e" Dec 04 11:08:29 crc kubenswrapper[4776]: E1204 11:08:29.895341 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c5055801385b20f549a91de8a447784547c00267407766a7b567816884d822e\": container with ID starting with 5c5055801385b20f549a91de8a447784547c00267407766a7b567816884d822e not found: ID does not exist" containerID="5c5055801385b20f549a91de8a447784547c00267407766a7b567816884d822e" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.895386 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c5055801385b20f549a91de8a447784547c00267407766a7b567816884d822e"} err="failed to get container status \"5c5055801385b20f549a91de8a447784547c00267407766a7b567816884d822e\": rpc error: code = NotFound desc = could not find container \"5c5055801385b20f549a91de8a447784547c00267407766a7b567816884d822e\": container with ID 
starting with 5c5055801385b20f549a91de8a447784547c00267407766a7b567816884d822e not found: ID does not exist" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.895400 4776 scope.go:117] "RemoveContainer" containerID="a2d7c3bbc881f3050c10a4626ee99688c903a3f3890894af3f7f845666de378c" Dec 04 11:08:29 crc kubenswrapper[4776]: E1204 11:08:29.895891 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2d7c3bbc881f3050c10a4626ee99688c903a3f3890894af3f7f845666de378c\": container with ID starting with a2d7c3bbc881f3050c10a4626ee99688c903a3f3890894af3f7f845666de378c not found: ID does not exist" containerID="a2d7c3bbc881f3050c10a4626ee99688c903a3f3890894af3f7f845666de378c" Dec 04 11:08:29 crc kubenswrapper[4776]: I1204 11:08:29.895960 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d7c3bbc881f3050c10a4626ee99688c903a3f3890894af3f7f845666de378c"} err="failed to get container status \"a2d7c3bbc881f3050c10a4626ee99688c903a3f3890894af3f7f845666de378c\": rpc error: code = NotFound desc = could not find container \"a2d7c3bbc881f3050c10a4626ee99688c903a3f3890894af3f7f845666de378c\": container with ID starting with a2d7c3bbc881f3050c10a4626ee99688c903a3f3890894af3f7f845666de378c not found: ID does not exist" Dec 04 11:08:31 crc kubenswrapper[4776]: I1204 11:08:31.452850 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce" Dec 04 11:08:31 crc kubenswrapper[4776]: I1204 11:08:31.472449 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f38997e-1b70-4ea7-ad2b-18db4582e744" path="/var/lib/kubelet/pods/2f38997e-1b70-4ea7-ad2b-18db4582e744/volumes" Dec 04 11:08:31 crc kubenswrapper[4776]: I1204 11:08:31.814064 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" 
event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"608367ee3a08b840457ffc1fb6d507de2776bc3f97adac1654f0ffc37ddf6373"} Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.417115 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rz2qv"] Dec 04 11:09:18 crc kubenswrapper[4776]: E1204 11:09:18.418295 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f38997e-1b70-4ea7-ad2b-18db4582e744" containerName="extract-utilities" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.418313 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f38997e-1b70-4ea7-ad2b-18db4582e744" containerName="extract-utilities" Dec 04 11:09:18 crc kubenswrapper[4776]: E1204 11:09:18.418331 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f38997e-1b70-4ea7-ad2b-18db4582e744" containerName="extract-content" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.418337 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f38997e-1b70-4ea7-ad2b-18db4582e744" containerName="extract-content" Dec 04 11:09:18 crc kubenswrapper[4776]: E1204 11:09:18.418347 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f38997e-1b70-4ea7-ad2b-18db4582e744" containerName="registry-server" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.418353 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f38997e-1b70-4ea7-ad2b-18db4582e744" containerName="registry-server" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.418563 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f38997e-1b70-4ea7-ad2b-18db4582e744" containerName="registry-server" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.420156 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.429567 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rz2qv"] Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.481861 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-catalog-content\") pod \"certified-operators-rz2qv\" (UID: \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\") " pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.481942 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-utilities\") pod \"certified-operators-rz2qv\" (UID: \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\") " pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.482372 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtfbt\" (UniqueName: \"kubernetes.io/projected/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-kube-api-access-qtfbt\") pod \"certified-operators-rz2qv\" (UID: \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\") " pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.583883 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-catalog-content\") pod \"certified-operators-rz2qv\" (UID: \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\") " pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.583963 4776 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-utilities\") pod \"certified-operators-rz2qv\" (UID: \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\") " pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.584131 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtfbt\" (UniqueName: \"kubernetes.io/projected/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-kube-api-access-qtfbt\") pod \"certified-operators-rz2qv\" (UID: \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\") " pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.584438 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-catalog-content\") pod \"certified-operators-rz2qv\" (UID: \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\") " pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.584674 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-utilities\") pod \"certified-operators-rz2qv\" (UID: \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\") " pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.604517 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtfbt\" (UniqueName: \"kubernetes.io/projected/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-kube-api-access-qtfbt\") pod \"certified-operators-rz2qv\" (UID: \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\") " pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:18 crc kubenswrapper[4776]: I1204 11:09:18.742646 4776 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:19 crc kubenswrapper[4776]: I1204 11:09:19.549775 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rz2qv"] Dec 04 11:09:20 crc kubenswrapper[4776]: I1204 11:09:20.315168 4776 generic.go:334] "Generic (PLEG): container finished" podID="56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" containerID="a752fef20feef03acec5564d9b2bc0901eab3ab8d9a96c495d28cc5c4e9e1263" exitCode=0 Dec 04 11:09:20 crc kubenswrapper[4776]: I1204 11:09:20.315238 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz2qv" event={"ID":"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f","Type":"ContainerDied","Data":"a752fef20feef03acec5564d9b2bc0901eab3ab8d9a96c495d28cc5c4e9e1263"} Dec 04 11:09:20 crc kubenswrapper[4776]: I1204 11:09:20.315545 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz2qv" event={"ID":"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f","Type":"ContainerStarted","Data":"6b1a0550a31c37fac6b460bcf8543c4fe517d28714aee4f079b2f3c4fa7600c0"} Dec 04 11:09:21 crc kubenswrapper[4776]: I1204 11:09:21.328697 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz2qv" event={"ID":"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f","Type":"ContainerStarted","Data":"2db13339ea3e00a34fb3c5f32ad240a7d423a66fbe32a29f4fa243682ec3f7c6"} Dec 04 11:09:22 crc kubenswrapper[4776]: I1204 11:09:22.339571 4776 generic.go:334] "Generic (PLEG): container finished" podID="56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" containerID="2db13339ea3e00a34fb3c5f32ad240a7d423a66fbe32a29f4fa243682ec3f7c6" exitCode=0 Dec 04 11:09:22 crc kubenswrapper[4776]: I1204 11:09:22.339645 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz2qv" 
event={"ID":"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f","Type":"ContainerDied","Data":"2db13339ea3e00a34fb3c5f32ad240a7d423a66fbe32a29f4fa243682ec3f7c6"} Dec 04 11:09:23 crc kubenswrapper[4776]: I1204 11:09:23.351527 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz2qv" event={"ID":"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f","Type":"ContainerStarted","Data":"3046f3ff7e84f169e958658055d932d6954655cdba14a2f1add412a929c1465d"} Dec 04 11:09:23 crc kubenswrapper[4776]: I1204 11:09:23.375815 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rz2qv" podStartSLOduration=2.972383152 podStartE2EDuration="5.375773269s" podCreationTimestamp="2025-12-04 11:09:18 +0000 UTC" firstStartedPulling="2025-12-04 11:09:20.317496153 +0000 UTC m=+5405.183976530" lastFinishedPulling="2025-12-04 11:09:22.72088627 +0000 UTC m=+5407.587366647" observedRunningTime="2025-12-04 11:09:23.37165989 +0000 UTC m=+5408.238140277" watchObservedRunningTime="2025-12-04 11:09:23.375773269 +0000 UTC m=+5408.242253646" Dec 04 11:09:24 crc kubenswrapper[4776]: I1204 11:09:24.713838 4776 scope.go:117] "RemoveContainer" containerID="6ae3c50fce9db7ae5fb920ffd0ed65a57ceb70464354f708f47d9b3c43485439" Dec 04 11:09:28 crc kubenswrapper[4776]: I1204 11:09:28.742792 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:28 crc kubenswrapper[4776]: I1204 11:09:28.743462 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:28 crc kubenswrapper[4776]: I1204 11:09:28.793864 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.215847 4776 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-xst6x"] Dec 04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.220283 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.229472 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xst6x"] Dec 04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.273884 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a930c48-200d-48e1-9473-6b6faaf00524-utilities\") pod \"redhat-operators-xst6x\" (UID: \"0a930c48-200d-48e1-9473-6b6faaf00524\") " pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.273985 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a930c48-200d-48e1-9473-6b6faaf00524-catalog-content\") pod \"redhat-operators-xst6x\" (UID: \"0a930c48-200d-48e1-9473-6b6faaf00524\") " pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.274084 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-949cs\" (UniqueName: \"kubernetes.io/projected/0a930c48-200d-48e1-9473-6b6faaf00524-kube-api-access-949cs\") pod \"redhat-operators-xst6x\" (UID: \"0a930c48-200d-48e1-9473-6b6faaf00524\") " pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.375688 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a930c48-200d-48e1-9473-6b6faaf00524-utilities\") pod \"redhat-operators-xst6x\" (UID: \"0a930c48-200d-48e1-9473-6b6faaf00524\") " 
pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.375766 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a930c48-200d-48e1-9473-6b6faaf00524-catalog-content\") pod \"redhat-operators-xst6x\" (UID: \"0a930c48-200d-48e1-9473-6b6faaf00524\") " pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.375841 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-949cs\" (UniqueName: \"kubernetes.io/projected/0a930c48-200d-48e1-9473-6b6faaf00524-kube-api-access-949cs\") pod \"redhat-operators-xst6x\" (UID: \"0a930c48-200d-48e1-9473-6b6faaf00524\") " pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.376380 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a930c48-200d-48e1-9473-6b6faaf00524-utilities\") pod \"redhat-operators-xst6x\" (UID: \"0a930c48-200d-48e1-9473-6b6faaf00524\") " pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.376619 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a930c48-200d-48e1-9473-6b6faaf00524-catalog-content\") pod \"redhat-operators-xst6x\" (UID: \"0a930c48-200d-48e1-9473-6b6faaf00524\") " pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.398395 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-949cs\" (UniqueName: \"kubernetes.io/projected/0a930c48-200d-48e1-9473-6b6faaf00524-kube-api-access-949cs\") pod \"redhat-operators-xst6x\" (UID: \"0a930c48-200d-48e1-9473-6b6faaf00524\") " pod="openshift-marketplace/redhat-operators-xst6x" Dec 
04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.463878 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:29 crc kubenswrapper[4776]: I1204 11:09:29.554451 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:30 crc kubenswrapper[4776]: I1204 11:09:30.064711 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xst6x"] Dec 04 11:09:30 crc kubenswrapper[4776]: I1204 11:09:30.202558 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rz2qv"] Dec 04 11:09:30 crc kubenswrapper[4776]: I1204 11:09:30.414555 4776 generic.go:334] "Generic (PLEG): container finished" podID="0a930c48-200d-48e1-9473-6b6faaf00524" containerID="47a627e271a5600f092b62d14c112ae1005845019335f08cc067c1a6dc98a623" exitCode=0 Dec 04 11:09:30 crc kubenswrapper[4776]: I1204 11:09:30.416091 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xst6x" event={"ID":"0a930c48-200d-48e1-9473-6b6faaf00524","Type":"ContainerDied","Data":"47a627e271a5600f092b62d14c112ae1005845019335f08cc067c1a6dc98a623"} Dec 04 11:09:30 crc kubenswrapper[4776]: I1204 11:09:30.416120 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xst6x" event={"ID":"0a930c48-200d-48e1-9473-6b6faaf00524","Type":"ContainerStarted","Data":"1d2a0082594c80e15bdc53a1ac7d6c0b5cc471bf5f999e77b9f3a48ce3ba3742"} Dec 04 11:09:31 crc kubenswrapper[4776]: I1204 11:09:31.422784 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rz2qv" podUID="56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" containerName="registry-server" containerID="cri-o://3046f3ff7e84f169e958658055d932d6954655cdba14a2f1add412a929c1465d" gracePeriod=2 Dec 04 
11:09:32 crc kubenswrapper[4776]: I1204 11:09:32.434133 4776 generic.go:334] "Generic (PLEG): container finished" podID="56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" containerID="3046f3ff7e84f169e958658055d932d6954655cdba14a2f1add412a929c1465d" exitCode=0 Dec 04 11:09:32 crc kubenswrapper[4776]: I1204 11:09:32.434176 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz2qv" event={"ID":"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f","Type":"ContainerDied","Data":"3046f3ff7e84f169e958658055d932d6954655cdba14a2f1add412a929c1465d"} Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.056375 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.148959 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-utilities\") pod \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\" (UID: \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\") " Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.149171 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-catalog-content\") pod \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\" (UID: \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\") " Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.149225 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtfbt\" (UniqueName: \"kubernetes.io/projected/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-kube-api-access-qtfbt\") pod \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\" (UID: \"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f\") " Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.150173 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-utilities" (OuterVolumeSpecName: "utilities") pod "56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" (UID: "56a31572-cf5c-4a2f-9f35-cd2c5b6a294f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.156603 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-kube-api-access-qtfbt" (OuterVolumeSpecName: "kube-api-access-qtfbt") pod "56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" (UID: "56a31572-cf5c-4a2f-9f35-cd2c5b6a294f"). InnerVolumeSpecName "kube-api-access-qtfbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.212938 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" (UID: "56a31572-cf5c-4a2f-9f35-cd2c5b6a294f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.251856 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.251913 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.251962 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtfbt\" (UniqueName: \"kubernetes.io/projected/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f-kube-api-access-qtfbt\") on node \"crc\" DevicePath \"\"" Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.445726 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xst6x" event={"ID":"0a930c48-200d-48e1-9473-6b6faaf00524","Type":"ContainerStarted","Data":"ccada48ec653ebc63a19533486b426a1d9570c1c2e5f404b73fa79a7b1a982be"} Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.452275 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rz2qv" Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.465427 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rz2qv" event={"ID":"56a31572-cf5c-4a2f-9f35-cd2c5b6a294f","Type":"ContainerDied","Data":"6b1a0550a31c37fac6b460bcf8543c4fe517d28714aee4f079b2f3c4fa7600c0"} Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.465501 4776 scope.go:117] "RemoveContainer" containerID="3046f3ff7e84f169e958658055d932d6954655cdba14a2f1add412a929c1465d" Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.504168 4776 scope.go:117] "RemoveContainer" containerID="2db13339ea3e00a34fb3c5f32ad240a7d423a66fbe32a29f4fa243682ec3f7c6" Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.504629 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rz2qv"] Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.513855 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rz2qv"] Dec 04 11:09:33 crc kubenswrapper[4776]: I1204 11:09:33.539647 4776 scope.go:117] "RemoveContainer" containerID="a752fef20feef03acec5564d9b2bc0901eab3ab8d9a96c495d28cc5c4e9e1263" Dec 04 11:09:34 crc kubenswrapper[4776]: I1204 11:09:34.465582 4776 generic.go:334] "Generic (PLEG): container finished" podID="0a930c48-200d-48e1-9473-6b6faaf00524" containerID="ccada48ec653ebc63a19533486b426a1d9570c1c2e5f404b73fa79a7b1a982be" exitCode=0 Dec 04 11:09:34 crc kubenswrapper[4776]: I1204 11:09:34.465681 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xst6x" event={"ID":"0a930c48-200d-48e1-9473-6b6faaf00524","Type":"ContainerDied","Data":"ccada48ec653ebc63a19533486b426a1d9570c1c2e5f404b73fa79a7b1a982be"} Dec 04 11:09:35 crc kubenswrapper[4776]: I1204 11:09:35.466069 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" path="/var/lib/kubelet/pods/56a31572-cf5c-4a2f-9f35-cd2c5b6a294f/volumes" Dec 04 11:09:35 crc kubenswrapper[4776]: I1204 11:09:35.482287 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xst6x" event={"ID":"0a930c48-200d-48e1-9473-6b6faaf00524","Type":"ContainerStarted","Data":"7159fcf2d4b69baed38c8584cb20a5e538e4d3b86a8679607d2b1e12ed09388f"} Dec 04 11:09:35 crc kubenswrapper[4776]: I1204 11:09:35.511852 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xst6x" podStartSLOduration=1.880433265 podStartE2EDuration="6.511823483s" podCreationTimestamp="2025-12-04 11:09:29 +0000 UTC" firstStartedPulling="2025-12-04 11:09:30.416713138 +0000 UTC m=+5415.283193515" lastFinishedPulling="2025-12-04 11:09:35.048103356 +0000 UTC m=+5419.914583733" observedRunningTime="2025-12-04 11:09:35.500646181 +0000 UTC m=+5420.367126578" watchObservedRunningTime="2025-12-04 11:09:35.511823483 +0000 UTC m=+5420.378303860" Dec 04 11:09:39 crc kubenswrapper[4776]: I1204 11:09:39.555378 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:39 crc kubenswrapper[4776]: I1204 11:09:39.556300 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:40 crc kubenswrapper[4776]: I1204 11:09:40.608312 4776 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xst6x" podUID="0a930c48-200d-48e1-9473-6b6faaf00524" containerName="registry-server" probeResult="failure" output=< Dec 04 11:09:40 crc kubenswrapper[4776]: timeout: failed to connect service ":50051" within 1s Dec 04 11:09:40 crc kubenswrapper[4776]: > Dec 04 11:09:49 crc kubenswrapper[4776]: I1204 11:09:49.600854 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:49 crc kubenswrapper[4776]: I1204 11:09:49.661016 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:49 crc kubenswrapper[4776]: I1204 11:09:49.837883 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xst6x"] Dec 04 11:09:50 crc kubenswrapper[4776]: I1204 11:09:50.629701 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xst6x" podUID="0a930c48-200d-48e1-9473-6b6faaf00524" containerName="registry-server" containerID="cri-o://7159fcf2d4b69baed38c8584cb20a5e538e4d3b86a8679607d2b1e12ed09388f" gracePeriod=2 Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.078050 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.189613 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a930c48-200d-48e1-9473-6b6faaf00524-utilities\") pod \"0a930c48-200d-48e1-9473-6b6faaf00524\" (UID: \"0a930c48-200d-48e1-9473-6b6faaf00524\") " Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.189976 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-949cs\" (UniqueName: \"kubernetes.io/projected/0a930c48-200d-48e1-9473-6b6faaf00524-kube-api-access-949cs\") pod \"0a930c48-200d-48e1-9473-6b6faaf00524\" (UID: \"0a930c48-200d-48e1-9473-6b6faaf00524\") " Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.190146 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a930c48-200d-48e1-9473-6b6faaf00524-catalog-content\") pod \"0a930c48-200d-48e1-9473-6b6faaf00524\" (UID: 
\"0a930c48-200d-48e1-9473-6b6faaf00524\") " Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.191749 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a930c48-200d-48e1-9473-6b6faaf00524-utilities" (OuterVolumeSpecName: "utilities") pod "0a930c48-200d-48e1-9473-6b6faaf00524" (UID: "0a930c48-200d-48e1-9473-6b6faaf00524"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.196588 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a930c48-200d-48e1-9473-6b6faaf00524-kube-api-access-949cs" (OuterVolumeSpecName: "kube-api-access-949cs") pod "0a930c48-200d-48e1-9473-6b6faaf00524" (UID: "0a930c48-200d-48e1-9473-6b6faaf00524"). InnerVolumeSpecName "kube-api-access-949cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.293445 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a930c48-200d-48e1-9473-6b6faaf00524-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.293510 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-949cs\" (UniqueName: \"kubernetes.io/projected/0a930c48-200d-48e1-9473-6b6faaf00524-kube-api-access-949cs\") on node \"crc\" DevicePath \"\"" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.304375 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a930c48-200d-48e1-9473-6b6faaf00524-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a930c48-200d-48e1-9473-6b6faaf00524" (UID: "0a930c48-200d-48e1-9473-6b6faaf00524"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.395803 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a930c48-200d-48e1-9473-6b6faaf00524-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.641767 4776 generic.go:334] "Generic (PLEG): container finished" podID="0a930c48-200d-48e1-9473-6b6faaf00524" containerID="7159fcf2d4b69baed38c8584cb20a5e538e4d3b86a8679607d2b1e12ed09388f" exitCode=0 Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.641833 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xst6x" event={"ID":"0a930c48-200d-48e1-9473-6b6faaf00524","Type":"ContainerDied","Data":"7159fcf2d4b69baed38c8584cb20a5e538e4d3b86a8679607d2b1e12ed09388f"} Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.641864 4776 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xst6x" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.641906 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xst6x" event={"ID":"0a930c48-200d-48e1-9473-6b6faaf00524","Type":"ContainerDied","Data":"1d2a0082594c80e15bdc53a1ac7d6c0b5cc471bf5f999e77b9f3a48ce3ba3742"} Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.641974 4776 scope.go:117] "RemoveContainer" containerID="7159fcf2d4b69baed38c8584cb20a5e538e4d3b86a8679607d2b1e12ed09388f" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.668171 4776 scope.go:117] "RemoveContainer" containerID="ccada48ec653ebc63a19533486b426a1d9570c1c2e5f404b73fa79a7b1a982be" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.670159 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xst6x"] Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.689846 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xst6x"] Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.704625 4776 scope.go:117] "RemoveContainer" containerID="47a627e271a5600f092b62d14c112ae1005845019335f08cc067c1a6dc98a623" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.743106 4776 scope.go:117] "RemoveContainer" containerID="7159fcf2d4b69baed38c8584cb20a5e538e4d3b86a8679607d2b1e12ed09388f" Dec 04 11:09:51 crc kubenswrapper[4776]: E1204 11:09:51.743761 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7159fcf2d4b69baed38c8584cb20a5e538e4d3b86a8679607d2b1e12ed09388f\": container with ID starting with 7159fcf2d4b69baed38c8584cb20a5e538e4d3b86a8679607d2b1e12ed09388f not found: ID does not exist" containerID="7159fcf2d4b69baed38c8584cb20a5e538e4d3b86a8679607d2b1e12ed09388f" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.743870 4776 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7159fcf2d4b69baed38c8584cb20a5e538e4d3b86a8679607d2b1e12ed09388f"} err="failed to get container status \"7159fcf2d4b69baed38c8584cb20a5e538e4d3b86a8679607d2b1e12ed09388f\": rpc error: code = NotFound desc = could not find container \"7159fcf2d4b69baed38c8584cb20a5e538e4d3b86a8679607d2b1e12ed09388f\": container with ID starting with 7159fcf2d4b69baed38c8584cb20a5e538e4d3b86a8679607d2b1e12ed09388f not found: ID does not exist" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.743910 4776 scope.go:117] "RemoveContainer" containerID="ccada48ec653ebc63a19533486b426a1d9570c1c2e5f404b73fa79a7b1a982be" Dec 04 11:09:51 crc kubenswrapper[4776]: E1204 11:09:51.744303 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccada48ec653ebc63a19533486b426a1d9570c1c2e5f404b73fa79a7b1a982be\": container with ID starting with ccada48ec653ebc63a19533486b426a1d9570c1c2e5f404b73fa79a7b1a982be not found: ID does not exist" containerID="ccada48ec653ebc63a19533486b426a1d9570c1c2e5f404b73fa79a7b1a982be" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.744333 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccada48ec653ebc63a19533486b426a1d9570c1c2e5f404b73fa79a7b1a982be"} err="failed to get container status \"ccada48ec653ebc63a19533486b426a1d9570c1c2e5f404b73fa79a7b1a982be\": rpc error: code = NotFound desc = could not find container \"ccada48ec653ebc63a19533486b426a1d9570c1c2e5f404b73fa79a7b1a982be\": container with ID starting with ccada48ec653ebc63a19533486b426a1d9570c1c2e5f404b73fa79a7b1a982be not found: ID does not exist" Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.744350 4776 scope.go:117] "RemoveContainer" containerID="47a627e271a5600f092b62d14c112ae1005845019335f08cc067c1a6dc98a623" Dec 04 11:09:51 crc kubenswrapper[4776]: E1204 
11:09:51.744607 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47a627e271a5600f092b62d14c112ae1005845019335f08cc067c1a6dc98a623\": container with ID starting with 47a627e271a5600f092b62d14c112ae1005845019335f08cc067c1a6dc98a623 not found: ID does not exist" containerID="47a627e271a5600f092b62d14c112ae1005845019335f08cc067c1a6dc98a623"
Dec 04 11:09:51 crc kubenswrapper[4776]: I1204 11:09:51.744632 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47a627e271a5600f092b62d14c112ae1005845019335f08cc067c1a6dc98a623"} err="failed to get container status \"47a627e271a5600f092b62d14c112ae1005845019335f08cc067c1a6dc98a623\": rpc error: code = NotFound desc = could not find container \"47a627e271a5600f092b62d14c112ae1005845019335f08cc067c1a6dc98a623\": container with ID starting with 47a627e271a5600f092b62d14c112ae1005845019335f08cc067c1a6dc98a623 not found: ID does not exist"
Dec 04 11:09:53 crc kubenswrapper[4776]: I1204 11:09:53.517940 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a930c48-200d-48e1-9473-6b6faaf00524" path="/var/lib/kubelet/pods/0a930c48-200d-48e1-9473-6b6faaf00524/volumes"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.446424 4776 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbngq"]
Dec 04 11:10:48 crc kubenswrapper[4776]: E1204 11:10:48.447491 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" containerName="extract-utilities"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.447506 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" containerName="extract-utilities"
Dec 04 11:10:48 crc kubenswrapper[4776]: E1204 11:10:48.447522 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930c48-200d-48e1-9473-6b6faaf00524" containerName="registry-server"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.447588 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930c48-200d-48e1-9473-6b6faaf00524" containerName="registry-server"
Dec 04 11:10:48 crc kubenswrapper[4776]: E1204 11:10:48.447614 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930c48-200d-48e1-9473-6b6faaf00524" containerName="extract-utilities"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.447622 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930c48-200d-48e1-9473-6b6faaf00524" containerName="extract-utilities"
Dec 04 11:10:48 crc kubenswrapper[4776]: E1204 11:10:48.447635 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" containerName="extract-content"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.447641 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" containerName="extract-content"
Dec 04 11:10:48 crc kubenswrapper[4776]: E1204 11:10:48.447656 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930c48-200d-48e1-9473-6b6faaf00524" containerName="extract-content"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.447662 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930c48-200d-48e1-9473-6b6faaf00524" containerName="extract-content"
Dec 04 11:10:48 crc kubenswrapper[4776]: E1204 11:10:48.447673 4776 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" containerName="registry-server"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.447679 4776 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" containerName="registry-server"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.447863 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930c48-200d-48e1-9473-6b6faaf00524" containerName="registry-server"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.447881 4776 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a31572-cf5c-4a2f-9f35-cd2c5b6a294f" containerName="registry-server"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.449441 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.459902 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbngq"]
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.615343 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67c59795-50b3-482a-8639-6de8a1553323-utilities\") pod \"redhat-marketplace-sbngq\" (UID: \"67c59795-50b3-482a-8639-6de8a1553323\") " pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.615475 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67c59795-50b3-482a-8639-6de8a1553323-catalog-content\") pod \"redhat-marketplace-sbngq\" (UID: \"67c59795-50b3-482a-8639-6de8a1553323\") " pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.615593 4776 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgws9\" (UniqueName: \"kubernetes.io/projected/67c59795-50b3-482a-8639-6de8a1553323-kube-api-access-bgws9\") pod \"redhat-marketplace-sbngq\" (UID: \"67c59795-50b3-482a-8639-6de8a1553323\") " pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.717269 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgws9\" (UniqueName: \"kubernetes.io/projected/67c59795-50b3-482a-8639-6de8a1553323-kube-api-access-bgws9\") pod \"redhat-marketplace-sbngq\" (UID: \"67c59795-50b3-482a-8639-6de8a1553323\") " pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.717443 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67c59795-50b3-482a-8639-6de8a1553323-utilities\") pod \"redhat-marketplace-sbngq\" (UID: \"67c59795-50b3-482a-8639-6de8a1553323\") " pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.717529 4776 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67c59795-50b3-482a-8639-6de8a1553323-catalog-content\") pod \"redhat-marketplace-sbngq\" (UID: \"67c59795-50b3-482a-8639-6de8a1553323\") " pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.718332 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67c59795-50b3-482a-8639-6de8a1553323-utilities\") pod \"redhat-marketplace-sbngq\" (UID: \"67c59795-50b3-482a-8639-6de8a1553323\") " pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.718351 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67c59795-50b3-482a-8639-6de8a1553323-catalog-content\") pod \"redhat-marketplace-sbngq\" (UID: \"67c59795-50b3-482a-8639-6de8a1553323\") " pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.745844 4776 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgws9\" (UniqueName: \"kubernetes.io/projected/67c59795-50b3-482a-8639-6de8a1553323-kube-api-access-bgws9\") pod \"redhat-marketplace-sbngq\" (UID: \"67c59795-50b3-482a-8639-6de8a1553323\") " pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:48 crc kubenswrapper[4776]: I1204 11:10:48.776203 4776 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:49 crc kubenswrapper[4776]: I1204 11:10:49.262722 4776 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbngq"]
Dec 04 11:10:49 crc kubenswrapper[4776]: I1204 11:10:49.379778 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 11:10:49 crc kubenswrapper[4776]: I1204 11:10:49.380238 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 11:10:50 crc kubenswrapper[4776]: I1204 11:10:50.165758 4776 generic.go:334] "Generic (PLEG): container finished" podID="67c59795-50b3-482a-8639-6de8a1553323" containerID="4b6a4b1ee6eb4e514ba92dfaa73d06f5648b0c6e758820de9cb6f95eec04ddb4" exitCode=0
Dec 04 11:10:50 crc kubenswrapper[4776]: I1204 11:10:50.165829 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbngq" event={"ID":"67c59795-50b3-482a-8639-6de8a1553323","Type":"ContainerDied","Data":"4b6a4b1ee6eb4e514ba92dfaa73d06f5648b0c6e758820de9cb6f95eec04ddb4"}
Dec 04 11:10:50 crc kubenswrapper[4776]: I1204 11:10:50.165862 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbngq" event={"ID":"67c59795-50b3-482a-8639-6de8a1553323","Type":"ContainerStarted","Data":"61b7970dd93e0775132ac47725dc229514c2182ffb72ce00fcf80e4d194c361e"}
Dec 04 11:10:51 crc kubenswrapper[4776]: I1204 11:10:51.178440 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbngq" event={"ID":"67c59795-50b3-482a-8639-6de8a1553323","Type":"ContainerStarted","Data":"681f76f1a8baa6671b288c7694646a40f8374a7cd275796a78a2b99d7b4f32ac"}
Dec 04 11:10:52 crc kubenswrapper[4776]: I1204 11:10:52.190298 4776 generic.go:334] "Generic (PLEG): container finished" podID="67c59795-50b3-482a-8639-6de8a1553323" containerID="681f76f1a8baa6671b288c7694646a40f8374a7cd275796a78a2b99d7b4f32ac" exitCode=0
Dec 04 11:10:52 crc kubenswrapper[4776]: I1204 11:10:52.190352 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbngq" event={"ID":"67c59795-50b3-482a-8639-6de8a1553323","Type":"ContainerDied","Data":"681f76f1a8baa6671b288c7694646a40f8374a7cd275796a78a2b99d7b4f32ac"}
Dec 04 11:10:53 crc kubenswrapper[4776]: I1204 11:10:53.212694 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbngq" event={"ID":"67c59795-50b3-482a-8639-6de8a1553323","Type":"ContainerStarted","Data":"80d9f406caacee0901b301ce95ec2084c96353cf854d9785550ab129eea60e40"}
Dec 04 11:10:53 crc kubenswrapper[4776]: I1204 11:10:53.250807 4776 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbngq" podStartSLOduration=2.775889895 podStartE2EDuration="5.250784544s" podCreationTimestamp="2025-12-04 11:10:48 +0000 UTC" firstStartedPulling="2025-12-04 11:10:50.168671608 +0000 UTC m=+5495.035151985" lastFinishedPulling="2025-12-04 11:10:52.643566257 +0000 UTC m=+5497.510046634" observedRunningTime="2025-12-04 11:10:53.23955393 +0000 UTC m=+5498.106034317" watchObservedRunningTime="2025-12-04 11:10:53.250784544 +0000 UTC m=+5498.117264921"
Dec 04 11:10:58 crc kubenswrapper[4776]: I1204 11:10:58.776260 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:58 crc kubenswrapper[4776]: I1204 11:10:58.776745 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:58 crc kubenswrapper[4776]: I1204 11:10:58.818717 4776 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:59 crc kubenswrapper[4776]: I1204 11:10:59.312295 4776 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:10:59 crc kubenswrapper[4776]: I1204 11:10:59.357969 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbngq"]
Dec 04 11:11:01 crc kubenswrapper[4776]: I1204 11:11:01.276936 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sbngq" podUID="67c59795-50b3-482a-8639-6de8a1553323" containerName="registry-server" containerID="cri-o://80d9f406caacee0901b301ce95ec2084c96353cf854d9785550ab129eea60e40" gracePeriod=2
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.271811 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.293026 4776 generic.go:334] "Generic (PLEG): container finished" podID="67c59795-50b3-482a-8639-6de8a1553323" containerID="80d9f406caacee0901b301ce95ec2084c96353cf854d9785550ab129eea60e40" exitCode=0
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.293105 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbngq" event={"ID":"67c59795-50b3-482a-8639-6de8a1553323","Type":"ContainerDied","Data":"80d9f406caacee0901b301ce95ec2084c96353cf854d9785550ab129eea60e40"}
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.293140 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbngq" event={"ID":"67c59795-50b3-482a-8639-6de8a1553323","Type":"ContainerDied","Data":"61b7970dd93e0775132ac47725dc229514c2182ffb72ce00fcf80e4d194c361e"}
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.293162 4776 scope.go:117] "RemoveContainer" containerID="80d9f406caacee0901b301ce95ec2084c96353cf854d9785550ab129eea60e40"
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.293351 4776 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbngq"
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.328710 4776 scope.go:117] "RemoveContainer" containerID="681f76f1a8baa6671b288c7694646a40f8374a7cd275796a78a2b99d7b4f32ac"
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.362267 4776 scope.go:117] "RemoveContainer" containerID="4b6a4b1ee6eb4e514ba92dfaa73d06f5648b0c6e758820de9cb6f95eec04ddb4"
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.401073 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67c59795-50b3-482a-8639-6de8a1553323-catalog-content\") pod \"67c59795-50b3-482a-8639-6de8a1553323\" (UID: \"67c59795-50b3-482a-8639-6de8a1553323\") "
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.401155 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67c59795-50b3-482a-8639-6de8a1553323-utilities\") pod \"67c59795-50b3-482a-8639-6de8a1553323\" (UID: \"67c59795-50b3-482a-8639-6de8a1553323\") "
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.401346 4776 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgws9\" (UniqueName: \"kubernetes.io/projected/67c59795-50b3-482a-8639-6de8a1553323-kube-api-access-bgws9\") pod \"67c59795-50b3-482a-8639-6de8a1553323\" (UID: \"67c59795-50b3-482a-8639-6de8a1553323\") "
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.402254 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67c59795-50b3-482a-8639-6de8a1553323-utilities" (OuterVolumeSpecName: "utilities") pod "67c59795-50b3-482a-8639-6de8a1553323" (UID: "67c59795-50b3-482a-8639-6de8a1553323"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.409569 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67c59795-50b3-482a-8639-6de8a1553323-kube-api-access-bgws9" (OuterVolumeSpecName: "kube-api-access-bgws9") pod "67c59795-50b3-482a-8639-6de8a1553323" (UID: "67c59795-50b3-482a-8639-6de8a1553323"). InnerVolumeSpecName "kube-api-access-bgws9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.410863 4776 scope.go:117] "RemoveContainer" containerID="80d9f406caacee0901b301ce95ec2084c96353cf854d9785550ab129eea60e40"
Dec 04 11:11:02 crc kubenswrapper[4776]: E1204 11:11:02.411664 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80d9f406caacee0901b301ce95ec2084c96353cf854d9785550ab129eea60e40\": container with ID starting with 80d9f406caacee0901b301ce95ec2084c96353cf854d9785550ab129eea60e40 not found: ID does not exist" containerID="80d9f406caacee0901b301ce95ec2084c96353cf854d9785550ab129eea60e40"
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.411846 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d9f406caacee0901b301ce95ec2084c96353cf854d9785550ab129eea60e40"} err="failed to get container status \"80d9f406caacee0901b301ce95ec2084c96353cf854d9785550ab129eea60e40\": rpc error: code = NotFound desc = could not find container \"80d9f406caacee0901b301ce95ec2084c96353cf854d9785550ab129eea60e40\": container with ID starting with 80d9f406caacee0901b301ce95ec2084c96353cf854d9785550ab129eea60e40 not found: ID does not exist"
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.411888 4776 scope.go:117] "RemoveContainer" containerID="681f76f1a8baa6671b288c7694646a40f8374a7cd275796a78a2b99d7b4f32ac"
Dec 04 11:11:02 crc kubenswrapper[4776]: E1204 11:11:02.412392 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"681f76f1a8baa6671b288c7694646a40f8374a7cd275796a78a2b99d7b4f32ac\": container with ID starting with 681f76f1a8baa6671b288c7694646a40f8374a7cd275796a78a2b99d7b4f32ac not found: ID does not exist" containerID="681f76f1a8baa6671b288c7694646a40f8374a7cd275796a78a2b99d7b4f32ac"
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.412445 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"681f76f1a8baa6671b288c7694646a40f8374a7cd275796a78a2b99d7b4f32ac"} err="failed to get container status \"681f76f1a8baa6671b288c7694646a40f8374a7cd275796a78a2b99d7b4f32ac\": rpc error: code = NotFound desc = could not find container \"681f76f1a8baa6671b288c7694646a40f8374a7cd275796a78a2b99d7b4f32ac\": container with ID starting with 681f76f1a8baa6671b288c7694646a40f8374a7cd275796a78a2b99d7b4f32ac not found: ID does not exist"
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.412475 4776 scope.go:117] "RemoveContainer" containerID="4b6a4b1ee6eb4e514ba92dfaa73d06f5648b0c6e758820de9cb6f95eec04ddb4"
Dec 04 11:11:02 crc kubenswrapper[4776]: E1204 11:11:02.412839 4776 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b6a4b1ee6eb4e514ba92dfaa73d06f5648b0c6e758820de9cb6f95eec04ddb4\": container with ID starting with 4b6a4b1ee6eb4e514ba92dfaa73d06f5648b0c6e758820de9cb6f95eec04ddb4 not found: ID does not exist" containerID="4b6a4b1ee6eb4e514ba92dfaa73d06f5648b0c6e758820de9cb6f95eec04ddb4"
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.412876 4776 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6a4b1ee6eb4e514ba92dfaa73d06f5648b0c6e758820de9cb6f95eec04ddb4"} err="failed to get container status \"4b6a4b1ee6eb4e514ba92dfaa73d06f5648b0c6e758820de9cb6f95eec04ddb4\": rpc error: code = NotFound desc = could not find container \"4b6a4b1ee6eb4e514ba92dfaa73d06f5648b0c6e758820de9cb6f95eec04ddb4\": container with ID starting with 4b6a4b1ee6eb4e514ba92dfaa73d06f5648b0c6e758820de9cb6f95eec04ddb4 not found: ID does not exist"
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.425489 4776 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67c59795-50b3-482a-8639-6de8a1553323-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67c59795-50b3-482a-8639-6de8a1553323" (UID: "67c59795-50b3-482a-8639-6de8a1553323"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.504833 4776 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgws9\" (UniqueName: \"kubernetes.io/projected/67c59795-50b3-482a-8639-6de8a1553323-kube-api-access-bgws9\") on node \"crc\" DevicePath \"\""
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.506041 4776 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67c59795-50b3-482a-8639-6de8a1553323-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.506169 4776 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67c59795-50b3-482a-8639-6de8a1553323-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.642303 4776 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbngq"]
Dec 04 11:11:02 crc kubenswrapper[4776]: I1204 11:11:02.652081 4776 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbngq"]
Dec 04 11:11:03 crc kubenswrapper[4776]: I1204 11:11:03.465317 4776 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67c59795-50b3-482a-8639-6de8a1553323" path="/var/lib/kubelet/pods/67c59795-50b3-482a-8639-6de8a1553323/volumes"
Dec 04 11:11:19 crc kubenswrapper[4776]: I1204 11:11:19.383325 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 11:11:19 crc kubenswrapper[4776]: I1204 11:11:19.383984 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 11:11:49 crc kubenswrapper[4776]: I1204 11:11:49.379828 4776 patch_prober.go:28] interesting pod/machine-config-daemon-d6wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 11:11:49 crc kubenswrapper[4776]: I1204 11:11:49.380423 4776 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 11:11:49 crc kubenswrapper[4776]: I1204 11:11:49.380473 4776 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt"
Dec 04 11:11:49 crc kubenswrapper[4776]: I1204 11:11:49.381318 4776 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"608367ee3a08b840457ffc1fb6d507de2776bc3f97adac1654f0ffc37ddf6373"} pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 11:11:49 crc kubenswrapper[4776]: I1204 11:11:49.381386 4776 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" podUID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerName="machine-config-daemon" containerID="cri-o://608367ee3a08b840457ffc1fb6d507de2776bc3f97adac1654f0ffc37ddf6373" gracePeriod=600
Dec 04 11:11:49 crc kubenswrapper[4776]: I1204 11:11:49.792938 4776 generic.go:334] "Generic (PLEG): container finished" podID="a57f7940-a976-4c85-bcb7-a1c24ba08266" containerID="608367ee3a08b840457ffc1fb6d507de2776bc3f97adac1654f0ffc37ddf6373" exitCode=0
Dec 04 11:11:49 crc kubenswrapper[4776]: I1204 11:11:49.792956 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerDied","Data":"608367ee3a08b840457ffc1fb6d507de2776bc3f97adac1654f0ffc37ddf6373"}
Dec 04 11:11:49 crc kubenswrapper[4776]: I1204 11:11:49.793313 4776 scope.go:117] "RemoveContainer" containerID="7f46c4f522b43e6cb433ef167b0ef32c606af59ec4f3d412624ce6dc78c290ce"
Dec 04 11:11:50 crc kubenswrapper[4776]: I1204 11:11:50.803968 4776 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d6wbt" event={"ID":"a57f7940-a976-4c85-bcb7-a1c24ba08266","Type":"ContainerStarted","Data":"28a2d654213984289f93230c492467312ac7f71a1da8f3c2370e561d71e5ee44"}